Compressive Sensing and Graphical Models. Volkan Cevher. Rice University ELEC 633 / STAT 631 Class


Compressive Sensing and Graphical Models Volkan Cevher volkan@rice.edu Rice University ELEC 633 / STAT 631 Class http://www.ece.rice.edu/~vc3/elec633/

Digital Revolution

Pressure is on Digital Sensors Success of digital data acquisition is placing increasing pressure on signal/image processing hardware and software to support higher resolution / denser sampling» ADCs, cameras, imaging systems, microarrays, large numbers of sensors» image databases, camera arrays, distributed wireless sensor networks, increasing numbers of modalities» acoustic, RF, visual, IR, UV, x-ray, gamma ray, ...

Pressure is on Digital Sensors Success of digital data acquisition is placing increasing pressure on signal/image processing hardware and software to support higher resolution / denser sampling» ADCs, cameras, imaging systems, microarrays, large numbers of sensors» image databases, camera arrays, distributed wireless sensor networks, increasing numbers of modalities» acoustic, RF, visual, IR, UV, ... deluge of data» how to acquire, store, fuse, process efficiently?

Sensing by Sampling Long-established paradigm for digital data acquisition uniformly sample data at Nyquist rate (2x Fourier bandwidth) sample

Sensing by Sampling Long-established paradigm for digital data acquisition uniformly sample data at Nyquist rate (2x Fourier bandwidth) sample too much data!

Sensing by Sampling Long-established paradigm for digital data acquisition uniformly sample data at Nyquist rate (2x Fourier bandwidth) compress data sample compress transmit/store JPEG JPEG2000 receive decompress

Sparsity / Compressibility pixels» large wavelet coefficients (blue = 0); wideband signal samples» large Gabor (time-frequency) coefficients

Sample / Compress Long-established paradigm for digital data acquisition uniformly sample data at Nyquist rate compress data sample compress transmit/store sparse / compressible wavelet transform receive decompress

What s Wrong with this Picture? Why go to all the work to acquire N samples only to discard all but K pieces of data? sample compress transmit/store sparse / compressible wavelet transform receive decompress
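The waste is easy to see in code. Below is a toy sketch of the sample-then-compress pipeline (the signal and the choices N = 8, K = 2 are made-up illustrations, using numpy):

```python
import numpy as np

def sample_then_compress(x, K):
    """Classical pipeline: acquire all N samples, then keep only the
    K largest-magnitude coefficients (the best K-term approximation)."""
    x = np.asarray(x, dtype=float)
    keep = np.argsort(np.abs(x))[::-1][:K]  # indices of the K largest entries
    x_hat = np.zeros_like(x)
    x_hat[keep] = x[keep]
    return x_hat

# N = 8 values acquired; only K = 2 survive compression
x = [0.1, 5.0, 0.2, -3.0, 0.05, 0.0, 0.3, 0.1]
x_hat = sample_then_compress(x, K=2)
print(np.count_nonzero(x_hat))  # 2 -- six of the eight acquired samples were discarded
```

Every sample is acquired at full rate, yet only the two dominant coefficients ever leave the encoder; compressive sensing asks whether we can acquire roughly that many numbers in the first place.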

Compressive Sensing Directly acquire compressed data Replace samples by more general measurements compressive sensing transmit/store receive reconstruct

Compressive Sensing Theory I Geometrical Perspective

Compressive Sensing Goal: Recover a sparse or compressible signal from measurements Problem: Random projection not full rank Solution: Exploit the sparsity/compressibility geometry of acquired signal

Compressive Sensing Goal: Recover a sparse or compressible signal from measurements iid Gaussian, iid Bernoulli Problem: Random projection not full rank but satisfies Restricted Isometry Property (RIP) Solution: Exploit the sparsity/compressibility geometry of acquired signal

Compressive Sensing Goal: Recover a sparse or compressible signal from measurements Problem: Random projection not full rank Solution: Exploit the model geometry of acquired signal

Concise Signal Structures Sparse signal: only K out of N coordinates nonzero. model: union of K-dimensional subspaces aligned w/ coordinate axes

Concise Signal Structures Sparse signal model: only K out of N coordinates nonzero. Compressible signal: sorted coordinates decay rapidly to zero (power-law decay)

Concise Signal Structures Sparse signal model: only K out of N coordinates nonzero. Compressible signal: sorted coordinates decay rapidly to zero. model: s-compressible, K-term approximation error

Sampling Signal x is K-sparse in basis/dictionary; WLOG assume sparse in the space domain (sparse signal, K nonzero entries)

Sampling Signal x is K-sparse in basis/dictionary; WLOG assume sparse in the space domain. Samples: measurements of the sparse signal (K nonzero entries)

Compressive Sampling When data is sparse/compressible, can directly acquire a condensed representation with no/little information loss through linear dimensionality reduction: M measurements y = Φx of the sparse signal (K nonzero entries)

How Can It Work? Projection Φ not full rank and so loses information in general. Ex: infinitely many x's map to the same y

How Can It Work? Projection Φ not full rank and so loses information in general. But we are only interested in sparse vectors

How Can It Work? Projection Φ not full rank and so loses information in general. But we are only interested in sparse vectors: Φ is effectively MxK

How Can It Work? Projection Φ not full rank and so loses information in general. But we are only interested in sparse vectors. Design Φ so that each of its MxK submatrices is full rank

How Can It Work? Goal: Design Φ so that its Mx2K submatrices are full rank: the difference between two K-sparse vectors is 2K-sparse in general, so this preserves information in K-sparse signals. Restricted Isometry Property (RIP) of order 2K

Unfortunately Goal: Design Φ so that its Mx2K submatrices are full rank (Restricted Isometry Property, RIP). Unfortunately, this is a combinatorial, NP-complete design problem

Insight from the 80s [Kashin, Gluskin] Draw Φ at random: iid Gaussian or iid Bernoulli entries. Then Φ has the RIP with high probability as long as M = O(K log(N/K)): Mx2K submatrices are full rank, stable embedding for sparse signals, extends to compressible signals in lp balls
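A quick numerical sanity check of this stable-embedding claim (the sizes N, M, K and the seed below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 256, 64, 4                 # ambient dimension, measurements, sparsity

# iid Gaussian measurement matrix, scaled so that E||Phi v||^2 = ||v||^2
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

# a random K-sparse test vector
x = np.zeros(N)
x[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)

# with high probability the random projection nearly preserves the norm
ratio = np.linalg.norm(Phi @ x) / np.linalg.norm(x)
print(ratio)  # close to 1, as the stable-embedding property predicts
```

Repeating this over many random sparse vectors shows the length distortion concentrating tightly around 1, which is exactly what the RIP quantifies.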

Compressive Data Acquisition Measurements y = random linear combinations of the entries of x. With high probability (WHP), Φ does not distort the structure of sparse signals: no information loss

CS Signal Recovery Recovery problem: given y, find x. Random projection not full rank (null space), so search in the translated null space for the best x according to some criterion, ex: least squares (hyperplane at random angle)

CS Signal Recovery Recovery: given y (ill-posed inverse problem), find x (sparse). Pseudoinverse: fast

CS Signal Recovery Recovery: given y (ill-posed inverse problem), find x (sparse). Pseudoinverse: fast, wrong

Why L2 Doesn't Work for signals sparse in the space/time domain: the least squares, minimum-L2 solution is almost never sparse (null space of Φ translated to x, at a random angle)

CS Signal Recovery Reconstruction/decoding: (ill-posed inverse problem) given y, find x. L2: fast, wrong. L0 (number of nonzero entries): find the sparsest x in the translated nullspace

CS Signal Recovery Reconstruction/decoding: (ill-posed inverse problem) given y, find x. L2: fast, wrong. L0 (number of nonzero entries): correct, only M=2K measurements required to reconstruct a K-sparse signal

CS Signal Recovery Reconstruction/decoding: (ill-posed inverse problem) given y, find x. L2: fast, wrong. L0 (number of nonzero entries): correct, only M=2K measurements required to reconstruct a K-sparse signal, but slow: NP-complete algorithm
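The "correct but slow" L0 decoder can be written down directly. A brute-force sketch (the matrix, sizes, and seed are assumptions made for the demo) that searches every size-K support:

```python
import itertools
import numpy as np

def l0_recover(Phi, y, K, tol=1e-8):
    """Exhaustive L0 decoding: try every size-K support, fit by least
    squares, and return the first support that explains y exactly.
    Cost is C(N, K) least-squares solves -- combinatorial in N."""
    N = Phi.shape[1]
    for S in itertools.combinations(range(N), K):
        S = list(S)
        coef, *_ = np.linalg.lstsq(Phi[:, S], y, rcond=None)
        if np.linalg.norm(Phi[:, S] @ coef - y) < tol:
            x = np.zeros(N)
            x[S] = coef
            return x
    return None

rng = np.random.default_rng(1)
N, K = 12, 2
M = 2 * K                            # the minimal number of measurements
Phi = rng.standard_normal((M, N))
x_true = np.zeros(N)
x_true[[3, 7]] = [1.5, -2.0]
y = Phi @ x_true

x_hat = l0_recover(Phi, y, K)        # recovers x_true from only M = 2K measurements
```

Even at this toy size the decoder already tests up to C(12, 2) = 66 supports; the count explodes combinatorially as N and K grow, which is exactly why the convex and greedy alternatives on the next slides matter.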

CS Signal Recovery Recovery: given y (ill-posed inverse problem), find x (sparse). L2: fast, wrong. L0: correct, slow. L1: correct, efficient, mild oversampling in the number of measurements required [Candes, Romberg, Tao; Donoho], a linear program

CS Signal Recovery Recovery: given y (ill-posed inverse problem), find x (sparse). L0: correct, slow. L1: correct, efficient, mild oversampling [Candes, Romberg, Tao; Donoho], convex optimization. CoSaMP, IHT: correct, more efficient, mild oversampling in the number of measurements required [Tropp and Needell; Blumensath and Davies], iterative greedy algorithms

CS Recovery via Linear Programming Optimization problem (Basis Pursuit, BP): minimize ||x||1 subject to y = Φx. Splitting x into its positive and negative parts yields a standard linear program

Why L1 Works for signals sparse in the space/time domain: the minimum-L1 solution = sparsest solution (with high probability) if M = O(K log(N/K))

Universality Random measurements can be used for signals sparse in any basis


Universality Random measurements can be used for signals sparse in any basis (sparse coefficient vector, K nonzero entries)

Compressive Sensing Directly acquire compressed data Replace N samples by M random projections random measurements transmit/store receive linear program

Compressive Sensing Recovery Algorithms

CS Recovery Algorithms Convex optimization, noise-free signals: Linear programming (Basis Pursuit), FPC, Bregman iteration, ... Noisy signals: Basis Pursuit De-Noising (BPDN), Second-Order Cone Programming (SOCP), Dantzig selector, GPSR, ... Iterative greedy algorithms: Matching Pursuit (MP), Orthogonal Matching Pursuit (OMP), StOMP, CoSaMP, Iterative Hard Thresholding (IHT), ... software @ dsp.rice.edu/cs
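Of the iterative algorithms listed, Iterative Hard Thresholding is compact enough to sketch in full. A minimal illustration (the problem sizes, seed, and step size are my assumptions, not from the slides), alternating a gradient step with a hard threshold:

```python
import numpy as np

def iht(Phi, y, K, iters=200):
    """Iterative Hard Thresholding: a gradient step on ||y - Phi x||^2
    (step size 1/||Phi||^2) followed by keeping the K largest entries."""
    mu = 1.0 / np.linalg.norm(Phi, 2) ** 2    # spectral-norm step size
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        x = x + mu * (Phi.T @ (y - Phi @ x))  # gradient step
        small = np.argsort(np.abs(x))[:-K]    # indices of all but the K largest
        x[small] = 0.0                        # hard threshold to K-sparse
    return x

rng = np.random.default_rng(2)
N, M, K = 128, 64, 5
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
y = Phi @ x_true

x_hat = iht(Phi, y, K)
# x_hat is K-sparse, and with this step size each iteration
# never increases the residual relative to the all-zeros start
```

With the step size bounded by the inverse squared spectral norm, each iteration is a majorization-minimization step, which is what makes the residual monotonically non-increasing.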

SOCP Standard LP recovery; noisy measurements: Second-Order Cone Program, a convex, quadratic program

BPDN Standard LP recovery; noisy measurements: Basis Pursuit De-Noising, a convex, quadratic program
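BPDN is often solved in its unconstrained Lagrangian form, minimize 0.5||y - Φx||² + λ||x||₁, for which iterative soft thresholding (ISTA) is about the smallest possible solver. A sketch under made-up problem sizes and regularization weight:

```python
import numpy as np

def soft(v, t):
    """Soft thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(Phi, y, lam, iters=300):
    """Iterative soft thresholding for 0.5||y - Phi x||^2 + lam*||x||_1,
    the Lagrangian form of Basis Pursuit De-Noising."""
    L = np.linalg.norm(Phi, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        x = soft(x + Phi.T @ (y - Phi @ x) / L, lam / L)
    return x

rng = np.random.default_rng(3)
N, M = 64, 32
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
x_true = np.zeros(N)
x_true[[5, 20, 40]] = [3.0, -2.0, 1.5]
y = Phi @ x_true

lam = 0.05
x_hat = ista(Phi, y, lam)
obj = lambda v: 0.5 * np.linalg.norm(y - Phi @ v) ** 2 + lam * np.abs(v).sum()
# each ISTA step never increases the objective, so obj(x_hat) <= obj(0)
```

The soft-threshold step is what produces exact zeros in the estimate; a hard threshold in its place would give the IHT-style greedy iterations instead.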

Matching Pursuit Greedy algorithm. Key ideas: (1) the measurement y is composed of a sum of K columns of Φ; (2) identify which K columns sequentially, according to the size of their contribution to y

Matching Pursuit For each column of Φ, compute its correlation with the measurement residual. Choose the largest (greedy). Update the estimate by adding it in. Form the residual measurement and iterate until convergence

Orthogonal Matching Pursuit Same procedure as Matching Pursuit, except at each iteration: remove the selected column, re-orthogonalize the remaining columns of Φ. Converges in K iterations
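The select-then-refit loop of OMP fits in a few lines. A minimal sketch (the five-atom dictionary below is a made-up example, not one from the slides):

```python
import numpy as np

def omp(Phi, y, K):
    """Orthogonal Matching Pursuit: greedily pick the column most
    correlated with the residual, then re-fit all selected columns
    by least squares before forming the new residual."""
    support, r = [], y.copy()
    coef = np.zeros(0)
    for _ in range(K):
        j = int(np.argmax(np.abs(Phi.T @ r)))     # best-matching column
        support.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        r = y - Phi[:, support] @ coef            # orthogonalized residual
    x = np.zeros(Phi.shape[1])
    x[support] = coef
    return x

# toy dictionary: the 4 canonical basis vectors plus one unit-norm mixed atom
mixed = np.full(4, 0.5)
Phi = np.column_stack([np.eye(4), mixed])
x_true = np.array([2.0, 0.0, 1.0, 0.0, 0.0])      # 2-sparse in this dictionary
y = Phi @ x_true                                  # y = (2, 0, 1, 0)

x_hat = omp(Phi, y, K=2)                          # recovers x_true exactly
```

The least-squares re-fit on the whole selected support is the "orthogonalize" step that distinguishes OMP from plain MP, and it is why K iterations suffice here.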

Compressive Sensing Summary

CS Hallmarks CS changes the rules of the data acquisition game: exploits a priori signal sparsity information. Stable: acquisition/recovery process is numerically stable. Universal: same random projections / hardware can be used for any compressible signal class (generic). Asymmetrical (most processing at decoder): conventional: smart encoder, dumb decoder; CS: dumb encoder, smart decoder. Random projections weakly encrypted

CS Hallmarks Democratic: each measurement carries the same amount of information; robust to measurement loss and quantization; simple encoding. Ex: wireless streaming application with data loss. Conventional: complicated (unequal) error protection of compressed data (DCT/wavelet low frequency coefficients). CS: merely stream additional measurements and reconstruct using those that arrive safely (fountain-like)

Compressive Sensing Graphical Models

Sparsity Assumption: sparse/compressible. wavelet images; background subtraction (sparse image)

Sparsity and Structure Assumption: sparse/compressible. Reality: sparse/compressible with structure. wavelet images: Hidden Markov Trees [Duarte, Wakin and Baraniuk, 2005, 2008; La and Do, 2005, 2006; Lee and Bresler, 2008]; background subtraction: Markov Random Field/Ising Model [Cevher, Duarte, Hegde, Baraniuk, 2008]

Models for Sparse/Compressible Signals General models for diverse data types Restricted or probabilistic union of subspaces Graphical models Rooted Connected Trees for wavelet-sparse signals Markov Random Field/Ising Model for spatially clustered signals

Models for Sparse/Compressible Signals What can we expect? Fewer measurements, faster recovery, increased robustness and stability. Rooted Connected Trees for wavelet-sparse signals [Baraniuk, Cevher, Duarte, Hegde, submitted to Trans on IT]; Markov Random Field/Ising Model for spatially clustered signals [Cevher, Duarte, Hegde, Baraniuk, NIPS 2008]

Model-Sparse Signals a K-sparse signal a K model-sparse signal

Model-Sparse Signals a K-sparse signal; a K model-sparse signal: Rooted Connected Trees for wavelet-sparse signals

Model-Sparse Signals RIP requires M = O(K log(N/K)) for a K-sparse signal; a K model-sparse signal needs only M = O(K) [Blumensath and Davies, submitted to Trans on IT]. Rooted Connected Trees for wavelet-sparse signals

Model-Compressible Signals Model-based approximation error s-model-compressible signals

Standard CS Recovery Convex problem with guarantees [Candes, Romberg, Tao; Donoho]

Model-based CS Recovery Non-convex problem Rooted Connected Trees for wavelet-sparse signals Markov Random Field/Ising Model for spatially clustered signals

Model-based CS Recovery Iterative solution algorithms (below is based on CoSaMP) calculate current residual form signal estimate calculate enlarged support estimate estimate signal for obtained support shrink support of obtained estimate During iterations, signal support must agree with the signal (graphical) model change support enlarging and shrinking steps to enforce the signal (graphical) model

Model-based CS Recovery Iterative solution algorithms (below is based on CoSaMP) calculate current residual form signal estimate calculate enlarged support estimate estimate signal for obtained support shrink support of obtained estimate Performance guarantees similar to convex optimization
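To make the "change the support steps to enforce the model" idea concrete, here is a toy model-based variant of IHT in which the usual keep-K-largest threshold is replaced by a projection onto a single contiguous cluster of size K, a crude stand-in for the tree and Markov-field projections in the slides (the clustered model itself and all sizes below are illustrative assumptions):

```python
import numpy as np

def cluster_project(v, K):
    """Toy model projection: keep the contiguous length-K window of v
    with the most energy and zero everything else. This plays the role
    of the tree / Ising-model support projections in model-based CS."""
    energy = [float(np.sum(v[i:i + K] ** 2)) for i in range(len(v) - K + 1)]
    start = int(np.argmax(energy))
    out = np.zeros_like(v)
    out[start:start + K] = v[start:start + K]
    return out

def model_iht(Phi, y, K, iters=150):
    """IHT-style recovery with the hard threshold replaced by the
    structured (clustered-support) model projection."""
    mu = 1.0 / np.linalg.norm(Phi, 2) ** 2
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        x = cluster_project(x + mu * (Phi.T @ (y - Phi @ x)), K)
    return x

rng = np.random.default_rng(4)
N, M, K = 64, 24, 6
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
x_true = np.zeros(N)
x_true[30:36] = rng.standard_normal(K) + 2.0      # one spatial cluster
y = Phi @ x_true

x_hat = model_iht(Phi, y, K)
# the estimate's support is a single length-K cluster by construction
```

Only the projection changed relative to plain IHT: every iterate now lives in the much smaller union of clustered subspaces, which is the mechanism behind the reduced measurement requirements claimed for model-based recovery.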

Wavelet-tree for sample piecewise smooth signal

Tree-based Signal Recovery Heavisine, N=1024, M=80. Signal; CoSaMP (RMSE=1.123); L1-minimization (RMSE=0.751); Tree-based recovery (RMSE=0.037)

Monte Carlo Sims Wavelet Trees Tree-sparse piecewise cubic signals with <= 5 break points, length = 1024. 500 trials, average RMSE. Model-based recovers at M = 3.5K; CoSaMP needs M = 5K.

Clustered Sparsity [Cevher, Duarte, Hegde, Baraniuk; NIPS 2008]

A Vision Application: Background Subtraction Lattice Matching Pursuit (LaMP); recovered target compared across LaMP, CoSaMP, FPC

LaMP Convergence (vs. target): FPC 6.5 sec, CoSaMP 6.2 sec, LaMP iterations 0.9 sec

Summary Compressive sensing exploits signal sparsity/compressibility information CS via graphical models provides novel research directions in optimization, learning, and information theory exploits structure to make CS better, stronger, and faster uses efficient iterative algorithms to solve certain classes of model-based CS recovery problems dsp.rice.edu/cs
