5. Mathematical tools: overview and examples in astronomy
1 Master ISTI / PARI / IV — Introduction to Astronomical Image Processing
5. Mathematical tools: overview and examples in astronomy
André Jalobeanu, LSIIT / MIV / PASEO group — lsiit-miv.u-strasbg.fr/paseo
2 Mathematical tools: overview and examples in astronomy
- Direct tools (basic operations & filtering)
- Modeling (instrument/object description)
- Probability theory and statistics
  - Introduction (pdfs, estimation, detection)
  - Bayesian inference & graphical models
- Transforms and representations
  - Projections, frequency, multiresolution, multi-shape and resolution
  - Multidimensional data
- Functional optimization
- Mathematical morphology
3 Direct tools: basic operations & filtering
- Focus on operations involving a single pixel or its neighbors
- Understand the simplest tools for image processing
- See how filtering can effectively be used in astronomy
4 Single-pixel (elementwise) operations
- Image addition: frame co-addition in deep-field imaging (increases sensitivity)
- Image subtraction: sky background and detector bias subtraction; subtraction of registered images for comparison/detection purposes
- Image division: flat-field correction
- Pixel value transformation: thresholding functions; look-up tables for speed-up (integer values); thresholding, contrast enhancement, dynamic-range transforms
- Vector-valued pixels: change of basis (multispectral)
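The calibration chain above can be sketched in a few numpy lines. This is a minimal illustration on synthetic frames (the bias level, flat-field map and fluxes are invented for the example), not a real reduction pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for real CCD calibration data (values are illustrative).
bias = np.full((64, 64), 100.0)                    # detector bias level
flat = 1.0 + 0.05 * rng.standard_normal((64, 64))  # pixel sensitivity map
sky = 10.0                                         # uniform sky background

def calibrate(raw, bias, flat, sky):
    """Bias subtraction, flat-field division, then sky subtraction."""
    return (raw - bias) / flat - sky

# Co-add N calibrated frames of a constant 50-count scene:
# averaging reduces the noise standard deviation by about 1/sqrt(N).
frames = [calibrate(bias + flat * (sky + 50.0)
                    + rng.standard_normal((64, 64)), bias, flat, sky)
          for _ in range(16)]
coadd = np.mean(frames, axis=0)
```

With 16 frames the co-added image is noticeably less noisy than any single calibrated frame, which is exactly the sensitivity gain sought in deep-field imaging.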
5 Multiple-pixel operations: kernel filtering, interpolation
- Neighborhood filters
  - Linear: Finite Impulse Response (FIR) filters/masks (3x3, 5x5...); convolution with a kernel (image space); sharpen, blur, 1st/2nd derivatives, discretized operators (Laplacian...)
  - Rank filters (min, max, median): sort the pixel values within the pixel neighborhood; denoising (impulse noise)
  - More complex: filter and decision (comparison, threshold...); denoising, detection...
- Interpolation
  - Bilinear, Keys' bicubic (1 step)
  - Spline (2 steps: prefiltering, prediction); synthesis function vs. interpolant function [Unser 95]
[Figure: bilinear interpolation between four neighboring samples]
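The difference between linear and rank neighborhood filters shows up clearly on impulse noise (hot pixels, cosmic-ray hits). A small sketch on a synthetic flat image, using the `scipy.ndimage` filters (the image and hit rate are invented for the example):

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
image = np.full((32, 32), 20.0)

# Add impulse ("salt") noise: isolated hot pixels / cosmic-ray hits.
hits = rng.random(image.shape) < 0.01
noisy = image.copy()
noisy[hits] = 1000.0

# A 3x3 rank (median) filter removes isolated outliers entirely,
# while a 3x3 linear mean filter only spreads them out.
median = ndimage.median_filter(noisy, size=3)
mean = ndimage.uniform_filter(noisy, size=3)
```

The median output is essentially the clean image again, whereas the mean output still carries large residuals around every hit; this is why rank filters are the tool of choice against impulse noise.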
6 Filtering: processing representations
- Keep the signal, filter the noise
- Work in a different representation (transform: Fourier, wavelet...) to separate signal and noise
- Linear filtering: multiply by a factor a_k depending on the coefficient index k
  - e.g. Fourier transform, factor = function of the spatial frequency
  - e.g. Wiener filter for image deblurring
- Nonlinear filtering: apply a nonlinear function f of the coefficient, e.g. thresholding
  - Stationary: the function f does not depend on the coefficient index, e.g. denoising via simple wavelet coefficient thresholding
  - Adaptive: the function f_k depends on the coefficient index, e.g. deblurring via adaptive wavelet coefficient thresholding
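Both filtering styles can be sketched in the Fourier representation (the slide's wavelet examples work the same way, coefficient by coefficient). A minimal sketch on a synthetic 1D signal, assuming an invented two-sinusoid signal and noise level; the linear filter applies a 0/1 factor a_k per frequency, the nonlinear one hard-thresholds the coefficients:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 256
t = np.arange(n) / n
signal = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 7 * t)
noisy = signal + 0.5 * rng.standard_normal(n)

# Linear filtering: factor a_k depends only on the coefficient index
# (here: keep low spatial frequencies, discard the rest).
coef = np.fft.rfft(noisy)
a_k = (np.arange(coef.size) <= 10).astype(float)
linear = np.fft.irfft(a_k * coef, n)

# Nonlinear filtering: hard-threshold the coefficients; the threshold is
# set from the high-frequency (noise-dominated) coefficients.
thresh = 3.0 * np.abs(coef[coef.size // 2:]).mean()
nonlinear = np.fft.irfft(np.where(np.abs(coef) > thresh, coef, 0), n)

mse = lambda x: np.mean((x - signal) ** 2)
```

Both filtered signals are much closer to the clean signal than the noisy input, because the signal energy lives in a few coefficients while the noise is spread over all of them.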
7 Modeling tools
- Find out how to describe astronomical objects in 3D or 2D
- Know the basics of forward modeling principles: instrument and sensor description
8 Tools for object description
- Geometry
  - Simple objects (e.g. lines, spheres, ellipsoids)
  - Polygonal objects (e.g. planetary surface)
- Parametric functions (1D, 2D, 3D)
  - Simple functions (e.g. circular or ellipsoidal Gaussian)
  - Complex analytic functions (e.g. radial sigmoid)
  - Non-analytic functions (e.g. 2D integration of a 3D model)
9 Tools for image acquisition modeling
- Sampling theory [Nyquist 28, Shannon 49]
  - Regular sampling: regular grids (rectangular, hexagonal lattice)
  - Generalized sampling on irregular grids
- Object-based rendering: model discretization
- Physics: optics, turbulence, motion, sensor
  - Product of MTFs; analytic expressions or numerical models
- Probability & statistics (noise)
[Figure: sampling grid in image space and the image spectrum in frequency space; hexagonal lattice; Airy pattern]
10 Probability theory & statistics
- Give a short introduction to probability theory and the related statistical tools
- Understand the principles of Bayesian inference and estimation
- Get acquainted with complex modeling tools via graphical models
11 Introduction to probabilistic tools: random variables and pdfs
- Random variables: stochastic processes; probability density function (pdf); distribution theory
  - Constraints: positive, normalized; discrete or continuous variables
- Joint pdf of several variables x, y (random variable sets, e.g. pixels, parameters Θ)
- Marginal pdf: P(y) = ∫ P(x,y) dx
- Conditional pdf: P(x|y) = P(x,y) / P(y)
- Bayes' theorem: P(y|x) = P(x|y) P(y) / P(x)
- Terminology: P(data|Θ) likelihood, P(Θ) prior, P(data) evidence, P(Θ|data) posterior
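Bayes' theorem above is easy to exercise numerically. A toy detection example with invented numbers (the prior, likelihood and false-alarm rate are illustrative only): Θ = "a real star is present in the pixel", data = "the detector fires".

```python
# All probabilities below are illustrative assumptions, not measured values.
p_theta = 0.01            # prior P(Theta): a star in a given pixel is rare
p_data_given_theta = 0.9  # likelihood P(data | Theta)
p_data_given_not = 0.05   # false-alarm rate P(data | not Theta)

# Evidence P(data): marginalize over Theta.
p_data = p_data_given_theta * p_theta + p_data_given_not * (1 - p_theta)

# Posterior P(Theta | data) = P(data | Theta) P(Theta) / P(data).
posterior = p_data_given_theta * p_theta / p_data
```

Despite the 90% likelihood, the posterior is only about 0.15: with a 1% prior, most detections are false alarms. This is why the prior term matters in the Bayesian formulas that follow.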
12 Statistical tools and function fitting
- Function fitting principles
  - Provide a parametric function (the model)
  - Provide an error function: squared difference in the Gaussian case; explicit -log P(data_k|Θ) in general; robust functions allowing for impulse noise
- Statistical estimation: maximum likelihood arg max_Θ P(data|Θ), with P(data|Θ) = Π_k P(data_k|Θ) under an independence assumption
- Minimize the total error (cost function): regression
  - General case: iterative method (see functional optimization)
  - Special cases: closed-form solution
- Examples
  - Smooth background extraction: model = linear or polynomial function of the coordinates; closed-form solution via low-order moments
  - Star extraction and aperture photometry: model = 2D Gaussian (≈ PSF); free parameters: location (x,y), intensity L
    - Approximate closed-form solution (Gaussian noise, least squares): L = Σ_{i,j} I_{ij}; x = Σ_{i,j} i I_{ij} / L, y = Σ_{i,j} j I_{ij} / L (1st-order moments for the (x,y) centroid, sum for L; the exact method is iterative)
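The moment formulas for aperture photometry translate directly to numpy. A sketch on a synthetic star stamp (position, flux and PSF width are invented for the example):

```python
import numpy as np

# Hypothetical star stamp: 2D Gaussian PSF at (x0, y0) with total flux L0.
x0, y0, L0, sigma = 10.3, 14.7, 500.0, 2.0
i, j = np.meshgrid(np.arange(32), np.arange(32), indexing="ij")
I = L0 * np.exp(-((i - x0) ** 2 + (j - y0) ** 2) / (2 * sigma**2))
I /= 2 * np.pi * sigma**2  # normalize so the image sums to ~L0

# Zeroth moment: total intensity L; first moments: centroid (x, y).
L = I.sum()
x = (i * I).sum() / L
y = (j * I).sum() / L
```

On noiseless data the moments recover the flux and the sub-pixel centroid almost exactly; with noise they remain a good initialization for the exact iterative fit mentioned above.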
13 Graphical models [Jordan, MacKay]
- Independence properties of random variables
  - Stochastic independence: P(x,y) = P(x) P(y)
  - Conditional independence: P(x,y|z) = P(x|z) P(y|z)
- Dependence graphs or graphical models
  - Each node is a random variable (or a set of variables)
  - Edges represent dependencies (stochastic independence = no connection between nodes)
- Directed graphs or Bayesian networks: a set of converging arrows = conditional pdf (causality)
  - Joint pdf: P(X) = Π_t P(X_t | X_π(t)) Π_r P(X_r), over parents π(t) and roots r
- Undirected graphs or Markov networks: graph separation = conditional independence
  - Joint pdf: P(X) = (1/Z) exp(-Σ_c V_c(X_c)), with clique potentials V_c
14 Markov Random Fields (MRF) [Besag 86, Geman 84]
- Conditional dependence assumption: P(pixel | all others) = P(pixel | neighbors) — a pixel only depends on its neighbors!
- Define a neighborhood and clique system (e.g. first-order neighborhood, first-order cliques)
- Define the clique potentials V_i
- Hammersley-Clifford theorem: X is an MRF iff P(X) = e^{-U(X)}/Z (undirected graphical model), where U is the Gibbs energy (sum of the V_i) and Z a normalizing constant
- Sampling from P(X) using Metropolis or Gibbs; use of Markov Chain Monte Carlo (MCMC) methods for inference and estimation
15 Hidden & auxiliary variables
- Hidden variables
  - Underlying process explaining the complexity of a model
  - Causality: directed graphical models (e.g. Gaussian mixture)
  - Examples: class labels in segmentation, line process in adaptive regularization (denoising/deblurring), missing data (bad CCD pixels), model map (optimal representations)
- Auxiliary variables
  - Created to facilitate modeling, optimization or sampling; not necessarily a causal relation
  - Examples: edge processes B_x, B_y in edge-preserving regularization (denoising/deblurring)
[Figure: joint pdf P(X,M) with a spatially variable model map M; edge processes B_x and B_y]
16 Markov trees & multiscale dependencies
- Encode dependencies between scales: wavelet coefficients propagate through scales
- Markov tree structure: parent/children dependencies
- Hidden variables encode the implicit dependence: the variance propagates, but the value can change
- Joint subband histogram P(ξ_t, ξ_π(t))
- Hidden Markov Tree (HMT): multiscale, discrete Gaussian mixture model
  - P(s_t, s_π(t)): discrete probabilities; P(ξ_t | s_t): Gaussian pdf
17 Bayesian inference
- Express the joint posterior pdf
  - Build the full joint pdf (entire model + observed data), e.g. using graphical models: P(entire model | data)
  - Marginalization: eliminate the nuisance variables (elimination algorithm, integration, Laplace approximation)
  - Result: posterior marginal P(Θ|data)
  - Usually there is no closed-form solution: Monte Carlo approximation (sample from P) or Gaussian approximation
- Bayesian vs. classical approach: use the prior P(Θ)
  - Mode = Maximum A Posteriori (MAP), found by optimization of -log P
  - Uncertainties = covariance matrix Σ, provided by the second derivatives of -log P at the optimum
- Hypothesis testing, model assessment, model selection & mixtures...
18 Bayesian estimation: when you have to make a decision
- Bayes estimator [Marroquin 85]
  - Statistic: a function of the data, θ̂ = f(Y), with Y = data
  - Cost function: estimation error C(θ̂, θ)
  - Bayes risk: expected estimator error E_{P(θ,Y)}[C(θ̂, θ)]
  - Bayes estimator: arg min_{θ̂} E_{P(θ,Y)}[C(θ̂, θ)]
- The cost function determines the estimator: MAP (Maximum A Posteriori), PM (Posterior Mean)...
- In practice: empirical Bayesian estimation (Laplace or saddlepoint approximation) [MacKay]
19 Transforms & representations
- Understand how changing representations can help model information in astronomical images
- Find out how to efficiently separate the signal from the noise
- Grasp the principles and properties of several multiresolution transforms
20 Projections onto various subspaces
- Project images, scalar or vector-valued pixels, geometric models, parameters...
- Simple projections: enforce constraints
  - Strong support constraints (e.g. reduced spatial support)
  - Strong range constraints (e.g. positivity)
  - Projections onto convex sets
- Change of basis
  - Orthonormal change of basis (e.g. Fourier, wavelets)
  - Arbitrary change of basis (e.g. frames, biorthogonal wavelets)
- Overcomplete representations: arbitrary number of vectors (e.g. mixture of models)
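The two constraint projections above are one-liners in numpy. A minimal sketch (the vector and support mask are invented); applying such projections alternately is the projection-onto-convex-sets (POCS) idea:

```python
import numpy as np

def project_positive(x):
    """Projection onto the convex set {x : x >= 0} (range constraint)."""
    return np.maximum(x, 0.0)

def project_support(x, mask):
    """Projection onto signals vanishing outside a support mask."""
    return np.where(mask, x, 0.0)

x = np.array([-1.0, 2.0, -0.5, 3.0])
mask = np.array([True, True, False, True])

# Alternate the projections to enforce both constraints (POCS iteration).
y = project_support(project_positive(x), mask)
```

Each projection maps a point to the nearest element of its convex set, so repeated alternation converges to a point satisfying both constraints.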
21 Frequency-domain representation (Fourier)
- Modeling: spatial convolution = coefficient-wise multiplication of Fourier coefficients
  - Diagonalization of convolution-based operators
  - e.g. stationary self-similar processes (1/f power spectrum)
- Filtering: stationary and independent noise remains so in frequency space
  - Apply a factor according to the spatial frequency: noise filtering, deconvolution, enhancement
- Reconstruction: blind deconvolution methods (unknown image and PSF); interferometry
- Detection: template matching via cross-correlation (convolution, matched filter)
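The convolution theorem underlying this slide is easy to verify numerically: circular convolution in image space equals coefficient-wise multiplication in Fourier space. A sketch on small random arrays (sizes and data are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 16
image = rng.standard_normal((n, n))
kernel = rng.standard_normal((n, n))

# Circular convolution computed directly in image space (brute force).
direct = np.zeros_like(image)
for a in range(n):
    for b in range(n):
        for u in range(n):
            for v in range(n):
                direct[a, b] += image[u, v] * kernel[(a - u) % n, (b - v) % n]

# The same convolution via the Fourier transform: O(n^2 log n) instead
# of O(n^4), and the operator is diagonal in this representation.
via_fft = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(kernel)))
```

The two results agree to machine precision, which is why deconvolution and matched filtering are naturally formulated in frequency space.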
22 Multiresolution analysis [Mallat, Vetterli, Daubechies, etc.]
- Set of closed nested subspaces of L²(ℝ)
- Approximation a_j at scale j: projection of f on V_j
- Detail d_j at scale j: projection of f on W_j, such that V_{j-1} = V_j ⊕ W_j
- Basis functions: scaling functions & wavelets
[Figure: galaxy details at different resolutions or scales — from the Starck, Murtagh & Bijaoui multiresolution analysis & wavelets tutorial]
23 Multiscale Vision Model & applications [Bijaoui 95]
- First 6 scales of the à trous wavelet transform (scale-space isosurface visualization)
- Keep only the significant coefficients (3σ)
- Establish interscale connections: links between regions
- Segmentation into significant regions
24 Wavelets & sparse representations
- Sparse representation: a good approximation is achieved by keeping only a small number of coefficients; the information is concentrated in a few high-magnitude coefficients
- Wavelet pyramid: few significant coefficients w.r.t. the original image (image space vs. wavelet space); connection with image compression
[Figure: approximation error vs. number of kept coefficients for Haar and Symmlet-8 bases; pixel-domain asymptote E ~ N^(-1/2)]
25 Multi-shape/resolution representations
- One wavelet is not enough! Different wavelets have different shapes and properties (examples of 1D and 2D wavelets)
- Detection theory: correlation with a template function to detect; use multiple templates
- Optimal representation
  - Find pixons in images: circular or elliptical objects [Pina & Puetter 93] with various scales, locations and intensities
  - Detect various objects in noisy observations: the residual should be the observation noise
  - Find such objects in spectra (IFS) or in transform spaces: optimal representation (information theory)
26 Curse of dimensionality
- Problem: high dimensionality of pixels (multi- or hyperspectral data)
- Statistical learning: number of bins = N^d, so the number of needed samples increases exponentially with d!
- Typical data in integral field spectroscopy: 24x24 pixels, d=284; 16x16 pixels, d=…; …x80 pixels, d=2000
- Solution: find low-dimensional subspaces & project
27 Representing multidimensional data
- Principal Component Analysis (PCA)
  - Image covariance matrix: eigenvectors = principal variation axes
  - Select the largest eigenvalues and the related eigenvectors
- Independent Component Analysis (ICA): search for independent sources (non-Gaussian!)
- Nonlinear manifold representation
- Deterministic/probabilistic versions; various modeling levels: with or without noise, single or mixture of principal components
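PCA as described above — eigendecomposition of the covariance matrix, keep the largest eigenvalues — fits in a few numpy lines. A sketch on synthetic "spectral" data that really lives near a 2D subspace of 10 bands (the latent dimension, band count and noise level are invented):

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical cube flattened to (pixels, bands): 2D latent structure
# embedded in 10 spectral bands, plus a little noise.
latent = rng.standard_normal((500, 2))
mixing = rng.standard_normal((2, 10))
data = latent @ mixing + 0.01 * rng.standard_normal((500, 10))

# PCA: eigenvectors of the covariance matrix = principal variation axes.
centered = data - data.mean(axis=0)
cov = centered.T @ centered / (len(data) - 1)
eigval, eigvec = np.linalg.eigh(cov)          # returned in ascending order
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

# Keep the 2 largest components: project onto the low-dimensional subspace.
projected = centered @ eigvec[:, :2]
explained = eigval[:2].sum() / eigval.sum()
```

Two components capture essentially all the variance here, so the 10-band pixels can be represented in 2D with almost no loss — the escape from the curse of dimensionality promised on the previous slide.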
28 Functional optimization
- Understand why optimization is needed in most processing tasks
- Get to know the main optimization methods in multiple dimensions
- Know how to choose between various deterministic and stochastic algorithms
29 Why optimization is needed
- Direct, iterative methods: no explicit definition of a forward model; the solution is nevertheless sought by an optimization procedure
  - Constrained optimization of an objective or cost function
  - Simplest solution (e.g. maximum entropy/smoothness) under data-related constraints: set-theoretic (e.g. positivity) or strong (e.g. data prediction)
  - Most likely solution (e.g. least squares) under model-related constraints: set-theoretic (e.g. smoothness, bounds) or strong (e.g. normalization)
- Inverse, iterative methods: explicit forward modeling, no closed-form solution
  - Solution given by optimizing a functional F
  - Probabilistic methods: F = -log posterior pdf; other methods: F = Distance(prediction, observations)
- Different kinds of optimization
  - Image processing: image = arg min F
  - Parameter fitting: parameters = arg min F
  - In some cases, optimizing the functional F directly produces the processed image (e.g. CLEAN)
30 Deterministic methods
- Optimization without derivatives
  - Simplex method (linear programming)
  - Principal axis method (Brent)
- Gradient-based methods
  - Steepest descent (simple functions): uses line optimization only
  - Newton's method (convex functions): uses the Hessian (2nd derivative)
  - Quasi-Newton & modified Newton methods: use an approximation/correction of the Hessian
  - Gauss-Newton methods (sums of squares)
  - Conjugate gradient, linear or nonlinear (contours of a quadratic form); in the linear case on a quadratic form, convergence in a finite number of steps
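Steepest descent and Newton's method are easiest to compare on a convex quadratic, where Newton's single step is exact. A minimal sketch with an invented 2x2 problem:

```python
import numpy as np

# Minimize F(x) = 0.5 x^T A x - b^T x; the optimum satisfies A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad = lambda x: A @ x - b

# Steepest descent: step along -grad with a small fixed step size;
# converges geometrically, many iterations needed.
x = np.zeros(2)
for _ in range(200):
    x = x - 0.2 * grad(x)

# Newton's method: uses the Hessian (here A itself); for a quadratic,
# a single step lands exactly on the optimum.
x_newton = np.zeros(2) - np.linalg.solve(A, grad(np.zeros(2)))
```

Both reach the optimum A⁻¹b = (0.6, -0.8), but Newton needs one step where steepest descent needs hundreds — the trade-off being the cost of forming and inverting the Hessian, which motivates the quasi-Newton and conjugate-gradient variants above.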
31 Deterministic optimization: problems and solutions
- Objective function: high dimensionality & nonlinearity!
- Low computational efficiency
  - Choose an appropriate optimization method!
  - Take into account the dependence structure: choose the representation where the dependence is minimized (diagonalization)
  - Perform single-variable optimization whenever possible, e.g. Iterated Conditional Modes (ICM)
- Multiple local optima, nonconvexity
  - Multigrid optimization: use coarse-to-fine approximations of the objective function, initializing each level with the result of the previous level
  - Graduated nonconvexity & approximations
  - Auxiliary variables
  - Stochastic methods...
32 Example: quasi-Newton optimization
- Iterative scheme for object & camera parameter estimation (S = geometric object model, Θ = camera parameters)
- 1. Linearize the intensity (rendering): I(S,Θ) ≈ I(S₀,Θ₀) + (∂I/∂S)(S - S₀) + (∂I/∂Θ)(Θ - Θ₀)
- 2. The objective function is approximated by a quadratic form: optimize S, Θ using a conjugate gradient
- 3. The result is used to initialize the next iteration
- Each step is simpler than the original problem (linear optimization)
- Convergence criterion: small variation of S, Θ
33 Auxiliary variable methods
- Half-quadratic extensions [Charbonnier 94]
  - φ-functions: quadratic near 0, linear or log-like at ∞
  - Additive extension: φ(u) = inf_b [(b - u)² + ψ₁(b)]
  - Multiplicative extension: φ(u) = inf_b [b u² + ψ₂(b)]
  - Alternate optimizations w.r.t. b and u
  - Application: nonlinear regularization (e.g. deblurring or denoising)
- Missing data, augmented process: Expectation-Maximization [Baum 72, Dempster 77]
  - The complex posterior P(θ|Y) is replaced by an augmented process P(z, θ|Y); the E and M steps optimize a function Q that is simpler to optimize than P(θ|Y)
34 Stochastic methods
- Exception: no optimization — Bayesian inference of the full pdf P or of its properties
- Monte Carlo methods
  - Sample from the distribution P ∝ e^{-U(X)}, where U is the objective function
  - Compute the mode and higher-order moments: approximate the pdf P
- Simulated annealing [Kirkpatrick 83]
  - Allow for wrong directions: escape from local optima
  - Sample while slowly decreasing the temperature T in P ∝ e^{-U(X)/T}
  - e.g. deblurring, blind deconvolution, optimal representations via nonlinear fitting
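A minimal simulated-annealing sketch on an invented 1D objective with several local minima (the objective, cooling schedule and proposal width are all illustrative assumptions, not a tuned algorithm):

```python
import numpy as np

rng = np.random.default_rng(5)

# Invented multimodal objective: global minimum U(0) = 0, plus local
# minima created by the oscillating term.
U = lambda x: x**2 - 2.0 * np.cos(5.0 * x) + 2.0

x = 4.0   # start far from the global optimum
T = 2.0   # initial temperature
for _ in range(5000):
    candidate = x + 0.3 * rng.standard_normal()
    dU = U(candidate) - U(x)
    # Metropolis rule: uphill moves are sometimes accepted, with
    # probability exp(-dU/T), which lets the chain escape local optima.
    if dU < 0 or rng.random() < np.exp(-dU / T):
        x = candidate
    T = max(0.999 * T, 1e-3)  # slow geometric cooling schedule
```

At high temperature the chain explores freely; as T decreases it freezes into a deep basin, typically far below the starting value of the objective. A pure greedy descent from x = 4 would instead stop at the first local minimum it meets.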
35 Genetic algorithms [Holland 75]
- Principle
  - Choose an initial population
  - Repeat until a terminating condition is met:
    - Evaluate the individual fitnesses of a certain proportion of the population
    - Select pairs of best-ranking individuals to reproduce
    - Apply the crossover operator
    - Apply the mutation operator
- Examples
  - Multispectral image classification [Petremand et al. 05]
  - Fitting galaxy rotation curves, variable star period determination [Charbonneau 95]
36 Derivatives & partial differential equations
- Learn how to compute the derivatives needed in deterministic optimization methods
- See how some iterative processing techniques amount to solving partial differential equations
37 Computing derivatives
- Why compute derivatives? Deterministic optimization algorithms require them; they also help compute uncertainties
- Basic tool: the chain rule
  - Series (A → B → C): ∂C/∂A = (∂C/∂B)(∂B/∂A)
  - Parallel (A → B₁, B₂ → C): ∂C/∂A = (∂C/∂B₁)(∂B₁/∂A) + (∂C/∂B₂)(∂B₂/∂A)
- Example: rendering a polygonal object — derivative of a pixel intensity W w.r.t. the model parameters P_j, chained through the 2D polygon vertices π_i and the polygon/pixel overlap area A: ∂W/∂P_j = Σ_i (∂W/∂A)(∂A/∂π_i)(∂π_i/∂P_j)
- How does a change in the model affect the predicted image intensity?
38 Partial Differential Equations in image processing
- Image denoising
- Isotropic diffusion: the heat equation ∂I/∂t = ΔI (e.g. semi-implicit scheme)
  - Related to physics: each pixel has a temperature
  - Equivalent to Gaussian smoothing (convolution) of the image
- Anisotropic diffusion
  - Smooth only along object edges
  - Adaptive smoothing, edge-preserving, sharper details
  - Connections with nonlinear regularization
[Figure: noisy image vs. isotropic and anisotropic diffusion results]
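The isotropic case can be sketched with an explicit finite-difference scheme (the slide mentions a semi-implicit scheme; the explicit one below is the simplest variant and is stable for time steps up to 0.25). Image and noise level are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(6)
clean = np.zeros((32, 32))
clean[8:24, 8:24] = 1.0
noisy = clean + 0.2 * rng.standard_normal(clean.shape)

def laplacian(u):
    """5-point discrete Laplacian with replicated (Neumann) borders."""
    p = np.pad(u, 1, mode="edge")
    return p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4 * u

# Explicit scheme for dI/dt = Laplacian(I): each step is a small
# heat-diffusion step, i.e. a little Gaussian smoothing.
u = noisy.copy()
for _ in range(20):
    u = u + 0.2 * laplacian(u)
```

The total intensity (mean) is conserved by the Neumann boundary condition while the variance decreases: noise is smoothed away, but so are edges — the limitation that anisotropic diffusion addresses.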
39 Mathematical morphology
- Recall the basic principles of mathematical morphology
- Understand how simple morphological tools can be applied to astronomical images
40 Principles [Matheron, Serra 82]
- Set-theoretic approach
  - Structuring element (neighborhood system): shape
  - Sets of values, rank operations (e.g. min, max)
- Basic functions
  - Erosion: min {I_n}
  - Dilation: max {I_n}
- Morphological tools (combinations)
  - Opening (erosion-dilation), closing (dilation-erosion), top-hat (opening, then subtraction)
  - Hit-and-miss, skeleton, reconstruction, thinning...
- Morphological filters
  - Smoothing: open-close
  - Gradient, Laplacian...
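A small sketch of the top-hat transform with `scipy.ndimage` grey-scale morphology, on a synthetic frame (the linear background and the two "stars" are invented). The structuring element is chosen larger than the point sources, so the opening estimates the background and the top-hat isolates the stars:

```python
import numpy as np
from scipy import ndimage

# Smooth background gradient plus two bright point sources ("stars").
i, j = np.meshgrid(np.arange(64), np.arange(64), indexing="ij")
image = 50.0 + 0.2 * i
image[20, 20] += 100.0
image[40, 45] += 80.0

# Grey-scale opening (erosion then dilation) with a 7x7 structuring
# element removes structures smaller than the element: the stars vanish,
# leaving a background estimate.
selem = np.ones((7, 7))
opening = ndimage.grey_opening(image, footprint=selem)

# Top-hat = image - opening: background and halo removed, stars kept.
tophat = image - opening
```

This is exactly the background & halo removal step used on the Abell 3698 example of the next slide: the top-hat image is essentially zero except at the two point sources.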
41 Application examples [Candéas et al. 1997]
- Star extraction and mapping (top-hat transform: erosion, dilation, difference)
- Star/galaxy classification
- Multispectral image segmentation...
[Figure: original image of Abell 3698; top-hat for background & halo removal; open-close for morphological smoothing]
42 Further reading
- An interactive image processing course
- MathWorld (Wolfram Research)
- IRIS tutorial on CCD image processing, C. Buil
- ICCV'03 course on learning & vision (Blake, Freeman, Bishop, Viola)
- Various wavelet resources
- MacKay's book on information theory, inference & learning
- Introduction to graphical models & Bayesian networks
- Statistical learning, decision, graphical models (M. Jordan)
Contents Part I Decomposition and Recovery. Images 1 Filter Banks... 3 1.1 Introduction... 3 1.2 Filter Banks and Multirate Systems... 4 1.2.1 Discrete Fourier Transforms... 5 1.2.2 Modulated Filter Banks...
More informationFrom multiple images to catalogs
Lecture 14 From multiple images to catalogs Image reconstruction Optimal co-addition Sampling-reconstruction-resampling Resolving faint galaxies Automated object detection Photometric catalogs Deep CCD
More informationMarkov Random Fields and Gibbs Sampling for Image Denoising
Markov Random Fields and Gibbs Sampling for Image Denoising Chang Yue Electrical Engineering Stanford University changyue@stanfoed.edu Abstract This project applies Gibbs Sampling based on different Markov
More information10-701/15-781, Fall 2006, Final
-7/-78, Fall 6, Final Dec, :pm-8:pm There are 9 questions in this exam ( pages including this cover sheet). If you need more room to work out your answer to a question, use the back of the page and clearly
More informationECG782: Multidimensional Digital Signal Processing
Professor Brendan Morris, SEB 3216, brendan.morris@unlv.edu ECG782: Multidimensional Digital Signal Processing Spring 2014 TTh 14:30-15:45 CBC C313 Lecture 06 Image Structures 13/02/06 http://www.ee.unlv.edu/~b1morris/ecg782/
More informationAll good things must...
Lecture 17 Final Review All good things must... UW CSE vision faculty Course Grading Programming Projects (80%) Image scissors (20%) -DONE! Panoramas (20%) - DONE! Content-based image retrieval (20%) -
More informationM. Sc. (Artificial Intelligence and Machine Learning)
Course Name: Advanced Python Course Code: MSCAI 122 This course will introduce students to advanced python implementations and the latest Machine Learning and Deep learning libraries, Scikit-Learn and
More informationImage Processing, Analysis and Machine Vision
Image Processing, Analysis and Machine Vision Milan Sonka PhD University of Iowa Iowa City, USA Vaclav Hlavac PhD Czech Technical University Prague, Czech Republic and Roger Boyle DPhil, MBCS, CEng University
More informationComputer Vision Group Prof. Daniel Cremers. 4. Probabilistic Graphical Models Directed Models
Prof. Daniel Cremers 4. Probabilistic Graphical Models Directed Models The Bayes Filter (Rep.) (Bayes) (Markov) (Tot. prob.) (Markov) (Markov) 2 Graphical Representation (Rep.) We can describe the overall
More informationNote Set 4: Finite Mixture Models and the EM Algorithm
Note Set 4: Finite Mixture Models and the EM Algorithm Padhraic Smyth, Department of Computer Science University of California, Irvine Finite Mixture Models A finite mixture model with K components, for
More information1.7.1 Laplacian Smoothing
1.7.1 Laplacian Smoothing 320491: Advanced Graphics - Chapter 1 434 Theory Minimize energy functional total curvature estimate by polynomial-fitting non-linear (very slow!) 320491: Advanced Graphics -
More informationA Wavelet Tour of Signal Processing The Sparse Way
A Wavelet Tour of Signal Processing The Sparse Way Stephane Mallat with contributions from Gabriel Peyre AMSTERDAM BOSTON HEIDELBERG LONDON NEWYORK OXFORD PARIS SAN DIEGO SAN FRANCISCO SINGAPORE SYDNEY»TOKYO
More informationPATTERN CLASSIFICATION AND SCENE ANALYSIS
PATTERN CLASSIFICATION AND SCENE ANALYSIS RICHARD O. DUDA PETER E. HART Stanford Research Institute, Menlo Park, California A WILEY-INTERSCIENCE PUBLICATION JOHN WILEY & SONS New York Chichester Brisbane
More informationAnno accademico 2006/2007. Davide Migliore
Robotica Anno accademico 6/7 Davide Migliore migliore@elet.polimi.it Today What is a feature? Some useful information The world of features: Detectors Edges detection Corners/Points detection Descriptors?!?!?
More informationFeature Extraction and Image Processing, 2 nd Edition. Contents. Preface
, 2 nd Edition Preface ix 1 Introduction 1 1.1 Overview 1 1.2 Human and Computer Vision 1 1.3 The Human Vision System 3 1.3.1 The Eye 4 1.3.2 The Neural System 7 1.3.3 Processing 7 1.4 Computer Vision
More informationReview for the Final
Review for the Final CS 635 Review (Topics Covered) Image Compression Lossless Coding Compression Huffman Interpixel RLE Lossy Quantization Discrete Cosine Transform JPEG CS 635 Review (Topics Covered)
More informationBayesian Methods in Vision: MAP Estimation, MRFs, Optimization
Bayesian Methods in Vision: MAP Estimation, MRFs, Optimization CS 650: Computer Vision Bryan S. Morse Optimization Approaches to Vision / Image Processing Recurring theme: Cast vision problem as an optimization
More informationx' = c 1 x + c 2 y + c 3 xy + c 4 y' = c 5 x + c 6 y + c 7 xy + c 8
1. Explain about gray level interpolation. The distortion correction equations yield non integer values for x' and y'. Because the distorted image g is digital, its pixel values are defined only at integer
More informationOptimal Denoising of Natural Images and their Multiscale Geometry and Density
Optimal Denoising of Natural Images and their Multiscale Geometry and Density Department of Computer Science and Applied Mathematics Weizmann Institute of Science, Israel. Joint work with Anat Levin (WIS),
More informationEECS 556 Image Processing W 09. Image enhancement. Smoothing and noise removal Sharpening filters
EECS 556 Image Processing W 09 Image enhancement Smoothing and noise removal Sharpening filters What is image processing? Image processing is the application of 2D signal processing methods to images Image
More informationLecture 8 Object Descriptors
Lecture 8 Object Descriptors Azadeh Fakhrzadeh Centre for Image Analysis Swedish University of Agricultural Sciences Uppsala University 2 Reading instructions Chapter 11.1 11.4 in G-W Azadeh Fakhrzadeh
More informationLecture 27, April 24, Reading: See class website. Nonparametric regression and kernel smoothing. Structured sparse additive models (GroupSpAM)
School of Computer Science Probabilistic Graphical Models Structured Sparse Additive Models Junming Yin and Eric Xing Lecture 7, April 4, 013 Reading: See class website 1 Outline Nonparametric regression
More informationPattern Recognition. Kjell Elenius. Speech, Music and Hearing KTH. March 29, 2007 Speech recognition
Pattern Recognition Kjell Elenius Speech, Music and Hearing KTH March 29, 2007 Speech recognition 2007 1 Ch 4. Pattern Recognition 1(3) Bayes Decision Theory Minimum-Error-Rate Decision Rules Discriminant
More informationEE795: Computer Vision and Intelligent Systems
EE795: Computer Vision and Intelligent Systems Spring 2012 TTh 17:30-18:45 FDH 204 Lecture 10 130221 http://www.ee.unlv.edu/~b1morris/ecg795/ 2 Outline Review Canny Edge Detector Hough Transform Feature-Based
More informationMathematics in Image Processing
Mathematics in Image Processing Michal Šorel Department of Image Processing Institute of Information Theory and Automation (ÚTIA) Academy of Sciences of the Czech Republic http://zoi.utia.cas.cz/ Mathematics
More informationComputer Vision I - Filtering and Feature detection
Computer Vision I - Filtering and Feature detection Carsten Rother 30/10/2015 Computer Vision I: Basics of Image Processing Roadmap: Basics of Digital Image Processing Computer Vision I: Basics of Image
More informationCPSC 340: Machine Learning and Data Mining. Principal Component Analysis Fall 2016
CPSC 340: Machine Learning and Data Mining Principal Component Analysis Fall 2016 A2/Midterm: Admin Grades/solutions will be posted after class. Assignment 4: Posted, due November 14. Extra office hours:
More informationGenerative and discriminative classification techniques
Generative and discriminative classification techniques Machine Learning and Category Representation 013-014 Jakob Verbeek, December 13+0, 013 Course website: http://lear.inrialpes.fr/~verbeek/mlcr.13.14
More informationComputer vision: models, learning and inference. Chapter 13 Image preprocessing and feature extraction
Computer vision: models, learning and inference Chapter 13 Image preprocessing and feature extraction Preprocessing The goal of pre-processing is to try to reduce unwanted variation in image due to lighting,
More informationCHAPTER 1 Introduction 1. CHAPTER 2 Images, Sampling and Frequency Domain Processing 37
Extended Contents List Preface... xi About the authors... xvii CHAPTER 1 Introduction 1 1.1 Overview... 1 1.2 Human and Computer Vision... 2 1.3 The Human Vision System... 4 1.3.1 The Eye... 5 1.3.2 The
More informationExpectation Propagation
Expectation Propagation Erik Sudderth 6.975 Week 11 Presentation November 20, 2002 Introduction Goal: Efficiently approximate intractable distributions Features of Expectation Propagation (EP): Deterministic,
More informationAugmented Reality VU. Computer Vision 3D Registration (2) Prof. Vincent Lepetit
Augmented Reality VU Computer Vision 3D Registration (2) Prof. Vincent Lepetit Feature Point-Based 3D Tracking Feature Points for 3D Tracking Much less ambiguous than edges; Point-to-point reprojection
More informationNorbert Schuff VA Medical Center and UCSF
Norbert Schuff Medical Center and UCSF Norbert.schuff@ucsf.edu Medical Imaging Informatics N.Schuff Course # 170.03 Slide 1/67 Objective Learn the principle segmentation techniques Understand the role
More informationCHAPTER 3 DIFFERENT DOMAINS OF WATERMARKING. domain. In spatial domain the watermark bits directly added to the pixels of the cover
38 CHAPTER 3 DIFFERENT DOMAINS OF WATERMARKING Digital image watermarking can be done in both spatial domain and transform domain. In spatial domain the watermark bits directly added to the pixels of the
More informationCoE4TN4 Image Processing. Chapter 5 Image Restoration and Reconstruction
CoE4TN4 Image Processing Chapter 5 Image Restoration and Reconstruction Image Restoration Similar to image enhancement, the ultimate goal of restoration techniques is to improve an image Restoration: a
More informationCOMPUTER AND ROBOT VISION
VOLUME COMPUTER AND ROBOT VISION Robert M. Haralick University of Washington Linda G. Shapiro University of Washington T V ADDISON-WESLEY PUBLISHING COMPANY Reading, Massachusetts Menlo Park, California
More informationExpectation-Maximization Methods in Population Analysis. Robert J. Bauer, Ph.D. ICON plc.
Expectation-Maximization Methods in Population Analysis Robert J. Bauer, Ph.D. ICON plc. 1 Objective The objective of this tutorial is to briefly describe the statistical basis of Expectation-Maximization
More informationEdge and local feature detection - 2. Importance of edge detection in computer vision
Edge and local feature detection Gradient based edge detection Edge detection by function fitting Second derivative edge detectors Edge linking and the construction of the chain graph Edge and local feature
More informationD-Separation. b) the arrows meet head-to-head at the node, and neither the node, nor any of its descendants, are in the set C.
D-Separation Say: A, B, and C are non-intersecting subsets of nodes in a directed graph. A path from A to B is blocked by C if it contains a node such that either a) the arrows on the path meet either
More informationArtistic Stylization of Images and Video Part III Anisotropy and Filtering Eurographics 2011
Artistic Stylization of Images and Video Part III Anisotropy and Filtering Eurographics 2011 Hasso-Plattner-Institut, University of Potsdam, Germany Image/Video Abstraction Stylized Augmented Reality for
More informationCS334: Digital Imaging and Multimedia Edges and Contours. Ahmed Elgammal Dept. of Computer Science Rutgers University
CS334: Digital Imaging and Multimedia Edges and Contours Ahmed Elgammal Dept. of Computer Science Rutgers University Outlines What makes an edge? Gradient-based edge detection Edge Operators From Edges
More informationRegularization by multigrid-type algorithms
Regularization by multigrid-type algorithms Marco Donatelli Department of Physics and Mathematics University of Insubria Joint work with S. Serra-Capizzano Outline 1 Restoration of blurred and noisy images
More informationComputer Vision 2. SS 18 Dr. Benjamin Guthier Professur für Bildverarbeitung. Computer Vision 2 Dr. Benjamin Guthier
Computer Vision 2 SS 18 Dr. Benjamin Guthier Professur für Bildverarbeitung Computer Vision 2 Dr. Benjamin Guthier 1. IMAGE PROCESSING Computer Vision 2 Dr. Benjamin Guthier Content of this Chapter Non-linear
More informationFMA901F: Machine Learning Lecture 6: Graphical Models. Cristian Sminchisescu
FMA901F: Machine Learning Lecture 6: Graphical Models Cristian Sminchisescu Graphical Models Provide a simple way to visualize the structure of a probabilistic model and can be used to design and motivate
More informationMEDICAL IMAGE ANALYSIS
SECOND EDITION MEDICAL IMAGE ANALYSIS ATAM P. DHAWAN g, A B IEEE Engineering in Medicine and Biology Society, Sponsor IEEE Press Series in Biomedical Engineering Metin Akay, Series Editor +IEEE IEEE PRESS
More informationDeep Generative Models Variational Autoencoders
Deep Generative Models Variational Autoencoders Sudeshna Sarkar 5 April 2017 Generative Nets Generative models that represent probability distributions over multiple variables in some way. Directed Generative
More informationWavelet Applications. Texture analysis&synthesis. Gloria Menegaz 1
Wavelet Applications Texture analysis&synthesis Gloria Menegaz 1 Wavelet based IP Compression and Coding The good approximation properties of wavelets allow to represent reasonably smooth signals with
More informationStatistical and Learning Techniques in Computer Vision Lecture 1: Markov Random Fields Jens Rittscher and Chuck Stewart
Statistical and Learning Techniques in Computer Vision Lecture 1: Markov Random Fields Jens Rittscher and Chuck Stewart 1 Motivation Up to now we have considered distributions of a single random variable
More information