Level-set MCMC Curve Sampling and Geometric Conditional Simulation


Level-set MCMC Curve Sampling and Geometric Conditional Simulation Ayres Fan John W. Fisher III Alan S. Willsky February 16, 2007

Outline 1. Overview 2. Curve evolution 3. Markov chain Monte Carlo 4. Curve sampling 5. Conditional simulation 6. 2.5D Segmentation 2

Overview Curve evolution attempts to find a curve C (or curves C_i) that best segments an image (according to some model). The goal is to minimize an energy functional E(C) (viewed as a negative log likelihood). Find a local minimum using gradient descent 3

Sampling instead of optimization Draw multiple samples from a probability distribution p (e.g., uniform, Gaussian) Advantages: Naturally handles multi-modal distributions Can get out of local minima Higher-order statistics (e.g., variances) Conditional simulation 4

Outline 1. Overview 2. Curve evolution 3. Markov chain Monte Carlo 4. Curve sampling 5. Conditional simulation 6. 2.5D Segmentation 5

Planar curves A curve is a function C : [0,1] → R² (closed, so C(0) = C(1)) We wish to minimize an energy functional with a data fidelity term and a regularization term This results in a gradient flow, and we can write any flow in terms of the outward normal N: dC/dt = f N 6

Euclidean curve shortening flow Let E(C) = ∮_C ds (the length of C) This energy functional is smaller when C is shorter Gradient flow is the direction that decreases the curve length the fastest Use Euler-Lagrange and we see dC/dt = -κN, where κ is curvature, N is the outward normal 7

Level-Set Methods A curve is a function (infinite dimensional) A natural implementation approach is to use marker points on the boundary (snakes), but this has reinitialization issues and difficulty handling topological change Level set methods instead evolve a surface (one dimension higher than our curve) whose zero level set is the curve (Osher and Sethian) 8

Embedding the curve Force the level set Ψ to be zero on the curve: Ψ(C(t), t) = 0 The chain rule gives us Ψ_t + ∇Ψ · dC/dt = 0, so for dC/dt = fN with N = ∇Ψ/|∇Ψ| we get Ψ_t = -f|∇Ψ| 9
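As a concrete sketch, the curve shortening flow in level-set form (Ψ evolving by curvature times |∇Ψ|) can be stepped on a grid. This is a minimal illustration, not the authors' implementation; the grid size, time step, and finite-difference curvature formula are my own choices:

```python
import numpy as np

def curvature(psi, eps=1e-8):
    """Curvature kappa = div(grad(psi)/|grad(psi)|) via central differences."""
    py, px = np.gradient(psi)          # derivatives along rows (y) and cols (x)
    pyy, pyx = np.gradient(py)
    pxy, pxx = np.gradient(px)
    num = pxx * py**2 - 2.0 * px * py * pxy + pyy * px**2
    den = (px**2 + py**2) ** 1.5 + eps
    return num / den

def shorten_step(psi, dt=0.1):
    """One explicit step of level-set curve shortening: psi_t = kappa * |grad(psi)|."""
    py, px = np.gradient(psi)
    grad_mag = np.sqrt(px**2 + py**2)
    return psi + dt * curvature(psi) * grad_mag

# signed distance to a circle of radius 20 (negative inside)
n = 100
yy, xx = np.mgrid[:n, :n]
psi = np.sqrt((xx - 50.0) ** 2 + (yy - 50.0) ** 2) - 20.0
area0 = int((psi < 0).sum())
for _ in range(50):
    psi = shorten_step(psi)
area1 = int((psi < 0).sum())
```

For a circle the enclosed area shrinks under this flow, which is what the evolved `psi` shows.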

Popular energy functionals Geodesic active contours (Caselles et al.): E(C) = ∮_C g(|∇I|) ds, with g a decreasing edge-stopping function Separating the means (Yezzi et al.): E(C) = -(μ_in - μ_out)²/2, with μ_in, μ_out the mean intensities inside and outside C Piecewise constant intensities (Chan and Vese): E(C) = ∫_in (I - c_1)² dx + ∫_out (I - c_0)² dx 10

Examples-I 11

Examples-II 12

Outline 1. Overview 2. Curve evolution 3. Markov chain Monte Carlo 4. Curve sampling 5. Conditional simulation 6. 2.5D Segmentation 13

General MAP Model For segmentation: x is a curve, y is the observed image (can be vector-valued), S is a shape model Data model usually IID given the curve We wish to sample from p(x | y; S), but cannot do so directly 14

Markov Chain Monte Carlo A class of sampling methods that iteratively generate candidates based on the previous iterate (forming a Markov chain) Instead of sampling from p(x | y; S) directly, sample from a proposal distribution q and keep samples according to an acceptance rule a Examples include Gibbs sampling and Metropolis-Hastings 15

Metropolis-Hastings Metropolis-Hastings algorithm: 1. Start with x_0 2. At time t, generate candidate ϕ_t (given x_{t-1}) 3. Calculate the Hastings ratio: r_t = [p(ϕ_t) q(x_{t-1} | ϕ_t)] / [p(x_{t-1}) q(ϕ_t | x_{t-1})] 4. Set x_t = ϕ_t with probability min(1, r_t), otherwise x_t = x_{t-1} 5. Go back to 2 16
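The steps above can be sketched for a toy one-dimensional target. This is a hypothetical example, not the curve sampler itself: the bimodal target, step size, and seed are my choices, and the symmetric random-walk proposal makes the q terms in the Hastings ratio cancel:

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_hastings(log_p, x0, n_steps, step=0.5):
    """Random-walk Metropolis-Hastings; since q is symmetric, the
    Hastings ratio reduces to p(phi_t) / p(x_{t-1})."""
    x = x0
    samples = []
    for _ in range(n_steps):
        phi = x + step * rng.standard_normal()   # candidate from q(. | x)
        log_r = log_p(phi) - log_p(x)            # log Hastings ratio
        if np.log(rng.random()) < log_r:         # accept with prob min(1, r)
            x = phi
        samples.append(x)
    return np.array(samples)

# bimodal (multi-modal) target: equal mixture of Gaussians at -2 and +2;
# gradient descent would find only one mode, the sampler visits both
log_p = lambda x: np.logaddexp(-0.5 * (x + 2) ** 2, -0.5 * (x - 2) ** 2)
xs = metropolis_hastings(log_p, 0.0, 20000)
```

Over many steps the chain crosses the low-probability valley between the modes, which is the "can get out of local minima" advantage mentioned earlier.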

Asymptotic Convergence We want to form a Markov chain whose stationary distribution is p(x) To guarantee asymptotic convergence, sufficient conditions are: 1) Ergodicity 2) Detailed balance: p(x) T(x → x') = p(x') T(x' → x) for all x, x' 17
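Detailed balance can be checked numerically for a small discrete Metropolis-Hastings kernel (an illustrative toy, not from the talk; the three-state target is arbitrary):

```python
import numpy as np

# target distribution over 3 states and a uniform (symmetric) proposal
p = np.array([0.2, 0.5, 0.3])
q = np.full((3, 3), 1.0 / 3.0)

# build the Metropolis-Hastings transition kernel P
P = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        if i != j:
            a = min(1.0, (p[j] * q[j, i]) / (p[i] * q[i, j]))  # acceptance prob
            P[i, j] = q[i, j] * a
    P[i, i] = 1.0 - P[i].sum()  # rejection mass stays put

# detailed balance: p_i P_ij == p_j P_ji for every pair (i, j)
flux = p[:, None] * P
db = np.allclose(flux, flux.T)
# stationarity follows from detailed balance: p P == p
stat = np.allclose(p @ P, p)
```

By construction the MH acceptance rule enforces detailed balance, and stationarity of p follows, which is why the slide lists it as a sufficient condition.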

Outline 1. Overview 2. Curve evolution 3. Markov chain Monte Carlo 4. Curve sampling 5. Conditional simulation 6. 2.5D Segmentation 18

MCMC Curve Sampling Generate a perturbation on the curve by adding a smooth Gaussian random field f along the normal: C(s) ← C(s) + f(s) N(s) Σ controls the degree of smoothness in the field Note for portions where f is negative, shocks may develop (the so-called prairie-fire model) Implement using white noise and circular convolution 19
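The white-noise-plus-circular-convolution construction can be sketched as follows (a minimal sketch: the periodic Gaussian kernel and its width are my assumptions for the smoothing kernel h):

```python
import numpy as np

rng = np.random.default_rng(1)

def smooth_field(n_points, sigma=10.0):
    """Smooth random field on a closed curve: white noise circularly
    convolved (via FFT) with a periodic Gaussian kernel h.
    The kernel width sigma plays the role of the smoothness control."""
    s = np.arange(n_points)
    d = np.minimum(s, n_points - s)          # circular distance to index 0
    h = np.exp(-0.5 * (d / sigma) ** 2)
    h /= h.sum()
    noise = rng.standard_normal(n_points)    # white noise n
    # circular convolution h * n == ifft(fft(h) * fft(n))
    return np.real(np.fft.ifft(np.fft.fft(h) * np.fft.fft(noise)))

f = smooth_field(256)
```

The resulting f(s) would perturb a discretized curve along its normal, C(s) ← C(s) + f(s) N(s); the induced covariance Σ is circulant, determined by h.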

Smoothness issues While detailed balance assures asymptotic convergence, we may need to wait a very long time In this case, smooth curves have non-zero probability under q but are very unlikely to occur Can view the accumulation of perturbations as a sum of smoothed white-noise fields Σ_i h ∗ n_i (h is the smoothing kernel, n_i is white noise) Solution: make q more likely to move towards high-probability regions of p 20

Adding mean force We can add deterministic elements to f (i.e., a mean to q): the average behavior should then be to move towards higher-probability areas of p In the limit, setting the mean of f to be the gradient flow of the energy functional results in always accepting the perturbation 21

Coverage/Detailed balance It is easy to show we can go from any curve C_1 to any other curve C_2 (shrink to a point) For detailed balance, we need to compute the probability of generating C' from C (and vice versa) The probability of going from C to C' is the probability of generating f (which is Gaussian), and the reverse is the probability of f' (also Gaussian) 22

Approximations to q The relationship between f and the reverse perturbation f' is complicated because the normal function changes f' does not always exist (given an f); it is unknown what conditions on f are necessary to guarantee existence Various levels of exactness: assume the normals are unchanged (then f' = -f); locally-linear approximation; trace along the reverse path (technical issues) Unknown how the approximations affect convergence 23

Synthetic noisy image Piecewise-constant observation model: y = c_1 inside C, c_0 outside, plus white Gaussian noise with variance σ² Chan-Vese energy functional E(C) Probability distribution: p(C | y) ∝ exp(-E(C)/T) with T = 2σ² 24
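A piecewise-constant (Chan-Vese style) energy of this kind can be sketched on a binary segmentation mask. This is an illustrative implementation: the boundary-pixel curve-length proxy and the test image are my own choices:

```python
import numpy as np

def chan_vese_energy(img, mask, mu=0.0):
    """Piecewise-constant data energy: squared residuals about the
    region means, plus an optional length penalty weighted by mu."""
    c1 = img[mask].mean() if mask.any() else 0.0
    c0 = img[~mask].mean() if (~mask).any() else 0.0
    data = ((img[mask] - c1) ** 2).sum() + ((img[~mask] - c0) ** 2).sum()
    # crude curve-length proxy: count of boundary transitions in the mask
    m = mask.astype(float)
    length = np.abs(np.diff(m, axis=0)).sum() + np.abs(np.diff(m, axis=1)).sum()
    return data + mu * length

# synthetic noisy image: constant c_1 = 1 inside, c_0 = 0 outside, noise sigma
sigma = 0.1
T = 2 * sigma ** 2  # Boltzmann temperature, p(C|y) ~ exp(-E(C)/T)
rng = np.random.default_rng(2)
truth = np.zeros((32, 32), dtype=bool)
truth[8:24, 8:24] = True
img = truth.astype(float) + sigma * rng.standard_normal((32, 32))

e_true = chan_vese_energy(img, truth)
e_bad = chan_vese_energy(img, np.roll(truth, 8, axis=0))  # misplaced curve
```

The true mask yields a much lower energy (hence higher Boltzmann probability) than a shifted one, which is what the sampler exploits.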

Prostate in a Haystack 25

Most likely samples 26

Least likely samples 27

Confidence intervals 28

When best is not best In this example, the most likely samples under the model are not the most accurate according to the underlying truth The 10%/90% confidence bands do a good job of enclosing the true answer The histogram image tells us there is more uncertainty in the upper-right corner The median curve is quite close to the true curve Optimization would produce subpar results here 29

Bias-corrected prostate Expert segmentation, add noise (simulate body coil image) 30

Learn probability densities Use histograms Learn the pdf inside, p(y | 1), and the pdf outside, p(y | 0), and assume the pixels are IID given the curve 31
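The histogram-based likelihood can be sketched as follows (a minimal sketch; the bin count, intensity range, and synthetic image are assumptions):

```python
import numpy as np

def learn_histogram_pdfs(img, mask, bins=32, value_range=(0.0, 1.0)):
    """Histogram estimates of p(y|1) (inside the curve) and p(y|0) (outside)."""
    p_in, edges = np.histogram(img[mask], bins=bins, range=value_range, density=True)
    p_out, _ = np.histogram(img[~mask], bins=bins, range=value_range, density=True)
    return p_in, p_out, edges

def log_likelihood(img, mask, p_in, p_out, edges, eps=1e-6):
    """IID-given-the-curve log likelihood of the whole image."""
    idx = np.clip(np.digitize(img, edges) - 1, 0, len(p_in) - 1)
    ll = np.where(mask, np.log(p_in[idx] + eps), np.log(p_out[idx] + eps))
    return ll.sum()

# synthetic "bias-corrected" image: bright object on darker background
rng = np.random.default_rng(3)
truth = np.zeros((64, 64), dtype=bool)
truth[16:48, 16:48] = True
img = np.clip(np.where(truth, 0.7, 0.3)
              + 0.05 * rng.standard_normal((64, 64)), 0.0, 1.0)

p_in, p_out, edges = learn_histogram_pdfs(img, truth)
ll_true = log_likelihood(img, truth, p_in, p_out, edges)
ll_shift = log_likelihood(img, np.roll(truth, 16, axis=1), p_in, p_out, edges)
```

A curve matching the object scores a higher log likelihood than a shifted one, so the learned histograms drive the sampler toward the object.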

Initialization 32

Results 33

Prostate-only cluster 34

Multimodality and convergence Natural multimodal distribution When starting near one mode, need a lot of time to traverse valley between modes Clustering helps with presenting results Interesting work to be done in learning dimensionality of manifold and local approximations 35

Outline 1. Overview 2. Curve evolution 3. Markov chain Monte Carlo 4. Curve sampling 5. Conditional simulation 6. 2.5D Segmentation 36

User Information In many problems, the model admits many reasonable solutions Currently user input largely limited to initialization We can use user information to reduce the number of reasonable solutions Regions of inclusion or exclusion Partial segmentations Can help with both convergence speed and accuracy Interactive segmentation 37

Conditional simulation With conditional simulation, we are given the values on a subset of the variables We then wish to generate sample paths that fill in the remainder of the variables (e.g., simulating Brownian motion) 38

Simulating curves Say we are given C_s, a subset of C (with some uncertainty associated with it) We wish to sample the unknown part of the curve, C_u One way to view this is as sampling from p(C_u | C_s, y) ∝ p(y | C) p(C_s | C_u) p(C_u) The difficulty is being able to evaluate the middle term, as theoretically we need to integrate over p(C) 39

Simplifying Cases For special cases, the evaluation is tractable: 1. When C is low-dimensional (can do analytical integration or Monte Carlo integration) 2. When C_s is assumed to be exact 3. When p(C) has special form (e.g., independent, Markov) 4. When willing to approximate When implementing conditional simulation, modify q to be compatible with the new conditional probability 40

Gravity inversion Supplement to standard seismic data: segment bottom salt using an array of surface gravimeters (~10^-15 N accuracy) Subtract base effects (geoid, centrifugal force, etc.) to leave the salt effects Assume constant density inside and outside Model the energy as the L2 estimation error, with the probability a Boltzmann distribution: p(C) ∝ exp(-E(C)/T) 41
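A toy forward model and L2 energy of this flavor might look like the following. This is purely illustrative: the point-mass kernel, cell size, density contrast, and temperature are all my own assumptions, not the talk's model:

```python
import numpy as np

G = 6.674e-11  # gravitational constant

def gravity_z(mask, drho, cell=100.0, n_meas=16, depth_top=0.0):
    """Vertical gravity at surface gravimeter locations from a 2D
    density-anomaly mask (square cells of side `cell` metres, contrast
    `drho` kg/m^3), treating each cell as a line mass (2D kernel)."""
    ny, nx = mask.shape
    xs = (np.arange(nx) + 0.5) * cell            # cell centre x
    zs = (np.arange(ny) + 0.5) * cell + depth_top  # cell centre depth
    meas_x = np.linspace(0.0, nx * cell, n_meas)
    gz = np.zeros(n_meas)
    for j, x0 in enumerate(meas_x):
        dx = xs[None, :] - x0
        dz = zs[:, None]
        r2 = dx ** 2 + dz ** 2
        m = mask * drho * cell ** 2              # mass per unit strike length
        gz[j] = (2.0 * G * m * dz / r2).sum()    # 2D (line-mass) vertical kernel
    return gz

def energy(mask, obs, drho=200.0, T=1e-12):
    """L2 estimation error against observed gravity; T is the temperature."""
    return ((gravity_z(mask, drho) - obs) ** 2).sum() / T

# synthetic salt body and its (noise-free) observations
truth = np.zeros((20, 40), dtype=bool)
truth[5:15, 15:25] = True
obs = gravity_z(truth, 200.0)
e_true = energy(truth, obs)
e_shift = energy(np.roll(truth, 10, axis=1), obs)
```

Even with far fewer measurement points than image cells, a misplaced body produces a strictly larger misfit energy than the true one.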

A strange segmentation problem X- and z- components of gravity profile for synthetic salt body 42

Circle salt 43

Most likely samples 44

Least likely samples 45

Confidence intervals 46

Notable features There are far fewer measurement points than image pixels, but we can still do a reasonable job Much higher uncertainty at the bottom than the top (weaker measurements) Less uncertainty in the middle than on the sides The median of the histogram is not necessarily related to the median of the distribution 47

More complex example Same x- and z- components of gravity Synthetic image with more complex geometry 48

Initialization 49

Most likely samples Regular Conditionally simulated 50

Marginal confidence bounds Regular Conditionally simulated 51

Aggregating samples 52

Outline 1. Overview 2. Curve evolution 3. Markov chain Monte Carlo 4. Curve sampling 5. Conditional simulation 6. 2.5D Segmentation 53

Chan-Vese in 3D Energy functional with surface area regularization With a slice-based model, we can write the regularization term as a sum of terms over the surface between each pair of neighboring slice curves c_i and c_{i+1} 54

Zero-order hold approximation Approximate the volume as piecewise-constant cylinders Then we see that the surface areas have terms related to the curve length and the difference between neighboring slices This is an upper bound on the correct surface area 55
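The two kinds of terms can be sketched on a stack of binary slices (boundary-pixel counting as the curve-length proxy is my choice):

```python
import numpy as np

def perimeter(mask):
    """Boundary-transition count as a discrete curve-length proxy."""
    m = mask.astype(int)
    return (np.abs(np.diff(m, axis=0)).sum()
            + np.abs(np.diff(m, axis=1)).sum())

def zoh_surface_area(slices, dz=1.0):
    """Zero-order-hold surface area of a stack of binary slices:
    lateral area (perimeter * slice spacing, the curve-length terms)
    plus the area of the symmetric difference between neighboring
    slices (the between-slice difference terms)."""
    length_terms = sum(perimeter(s) * dz for s in slices)
    diff_terms = sum(np.logical_xor(a, b).sum()
                     for a, b in zip(slices, slices[1:]))
    return length_terms + diff_terms

# two nested square slices: 8x8 over 6x6
s1 = np.zeros((16, 16), dtype=bool); s1[4:12, 4:12] = True
s2 = np.zeros((16, 16), dtype=bool); s2[5:11, 5:11] = True
area = zoh_surface_area([s1, s2])
```

Here the lateral (curve-length) contribution is 32 + 24 = 56 and the top/bottom mismatch contributes 64 - 36 = 28, so the stacked-cylinder surface area is 84.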

Overall regularization term Adding everything together results in a sum of self potentials (per-slice curve-length terms) and edge potentials (between-slice difference terms) 56

2.5D Approach In 3D world, natural (or built-in) partition of volumes into slices Assume Markov relationship among slices Then have local potentials (e.g., PCA) and edge potentials (coupling between slices) Naturally lends itself to local Metropolis-Hastings approach (iterating over the slices) 57

2.5D Model We can model this as a simple chain structure with pairwise interactions This admits the factorization p(c_1, ..., c_n) ∝ ∏_i ψ_i(c_i) ∏_i ψ_{i,i+1}(c_i, c_{i+1}) 58

Partial segmentations Assume that we are given segmentations of every other slice We now want to sample surfaces conditioned on the fact that certain slices are fixed Markovianity tells us that c_2 and c_4 are independent conditioned on c_3 59

Log probability for c_2 We can then construct the probability for c_2 conditioned on its neighbors using the potential functions defined previously 60

Results Neighbor segmentations Our result (cyan) with expert (green) 61

Larger spacing Apply a local Metropolis-Hastings algorithm where we sample on a slice-by-slice basis Theory shows that asymptotic convergence is unchanged Unfortunately, larger spacing acts like weaker regularization Currently there are issues with poor data models that need to be resolved 62

Full 3D In medical imaging, have multiple slice orientations (axial, sagittal, coronal) In seismic, vertical and horizontal shape structure expected With a 2.5D approach, this introduces complexity to the graph structure 63

Incorporating perpendicular slices The perpendicular slice curve c is now coupled to all of the horizontal slices, but c only gives information on a subset of each slice Specify the edge potentials accordingly (e.g., coupling only where the slices intersect) 64

Other extensions Additional features can be added to the base model: Uncertainty on the expert segmentations Shape models (semi-local) Exclusion/inclusion regions Topological change (through level sets) 65

Conclusion Computationally feasible algorithm to sample from space of curves Approximate detailed balance Demonstrated utility for robustness to noise, multimodal distributions, displaying uncertainty Can generate arbitrary shapes with relatively complex geometry Conditional simulation provides a natural framework to incorporate partial user segmentations on a slice-by-slice level 66

Further research Multiple curves, known/unknown topology We would like q to naturally sample from space of smooth curves (different perturbation structure) Speed always an issue for MCMC approaches Multiresolution perturbations Parameterized perturbations (efficient basis) Hierarchical models Using samples to explore the geometry of shape manifolds 67

Smooth curve + smooth perturbations 68

SAD Target We define the symmetric area difference (SAD) between a curve and the target shape and use a Boltzmann distribution: p(C) ∝ exp(-SAD(C)/T) T is a parameter we can use to control how likely we are to keep less likely samples: we keep a sample with T log(2) additional error with probability ½ This is a single-mode distribution 69
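SAD and the Boltzmann acceptance behavior can be sketched as follows (the masks and temperature are illustrative):

```python
import numpy as np

def sad(mask_a, mask_b):
    """Symmetric area difference: area lying in exactly one of the two regions."""
    return int(np.logical_xor(mask_a, mask_b).sum())

def keep_probability(sad_candidate, sad_current, T):
    """Boltzmann acceptance ratio exp(-(E' - E)/T), capped at 1."""
    return min(1.0, float(np.exp(-(sad_candidate - sad_current) / T)))

# a 4x4 square and the same square shifted by one pixel
a = np.zeros((8, 8), dtype=bool); a[2:6, 2:6] = True
b = np.roll(a, 1, axis=1)
d = sad(a, b)

# a candidate carrying T*log(2) extra error is kept with probability 1/2
T = 5.0
p_half = keep_probability(10.0 + T * np.log(2), 10.0, T)
```

`p_half` comes out to exactly ½, matching the statement that T·log(2) of additional error halves the keep probability.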

Target Shape 70

Initialization 71

Most likely samples 72

Least likely samples 73

Confidence intervals 74

Doubling the temperature 75

Results 76