Compressed Sensing and L1-Related Minimization


Wotao Yin
Computational and Applied Mathematics, Rice University
Jan 4, 2008, Chinese Academy of Sciences, Institute of Computational Mathematics

The Problems of Interest

Unconstrained:  min_u  mu*||u||_1 + (1/2)*||A*u - b||_2^2
Constrained:    min_u  ||u||_1  subject to  A*u = b

These problems have found applications in compressed sensing, statistics, ...

Compressed Sensing

We want to compress a signal. Classical compression algorithms must know the signal first; here the compression is done before the signal is physically acquired. The goal is to reduce the number of measurements.
- u: a signal with many zero entries
- Apply a linear transform A to u, with #rows(A) < #cols(A)
- Physically acquire b = A*u
How do we get u from b when b = A*u is underdetermined? When u is sufficiently sparse and b is long enough, exact reconstruction is possible (both in theory and in practice).

A test in MATLAB

% Construct the compressed sensing problem
n = 200;
m = 100;
A = randn(m,n);
u = sprandn(n,1,.1);
b = A*u;
% Least-squares reconstruction:
% min ||u||_2, subject to A*u = b
u_recon = pinv(A)*b;

A test in MATLAB

[Figure: the true sparse signal u (left) and the least-squares reconstruction u_recon (right); the minimum-l2 solution is dense and does not match u.]

A test in MATLAB via YALMIP and CVX

Candes-Tao and Donoho suggest minimizing the l1 norm instead:

% YALMIP
x = sdpvar(n,1);
solvesdp([A*x == b], norm(x,1));
u_recon = double(x);

% CVX
cvx_begin
    variable u_recon(n)
    minimize(norm(u_recon,1))
    subject to
        A*u_recon == b
cvx_end

A test in MATLAB

[Figure: u (left) and the l1 reconstruction u_recon (right); the two coincide, i.e., exact recovery.]

A test in MATLAB: noisy case

% Construct the compressed sensing problem
n = 200;
m = 100;
A = randn(m,n);
u = full(sprandn(n,1,.1));
b = A*u + 0.01*randn(m,1);   % add noise
epsilon = 0.02;

YALMIP and CVX

% YALMIP
x = sdpvar(n,1);
solvesdp(norm(A*x - b) <= epsilon, norm(x,1));
u_recon = double(x);

% CVX
cvx_begin
    variable u_recon(n)
    minimize(norm(u_recon,1))
    subject to
        norm(A*u_recon - b, 2) <= epsilon
cvx_end

A test in MATLAB

[Figure: u (left) and the l1 reconstruction u_recon (right) for the noisy case.]

When is l1-reconstruction successful?

Definitions: n = dim(u) (signal size), k = ||u||_0 (signal sparsity), m = #rows(A) (sample size).
- When A is subgaussian (e.g., Gaussian, Bernoulli) (Mendelson et al.): m >= O(k log(n/k)) suffices
- When A is a Fourier submatrix (Candes, Romberg, and Tao, 2004): m >= O(k log n) suffices
- m can be even smaller using ||u||_p minimization for p < 1 (Candes-Wakin-Boyd, Chartrand, and Chartrand-Yin)
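As a rough empirical check of these sample-size thresholds, one can sweep m and count successful recoveries; a minimal sketch, assuming CVX is installed (the sizes, sparsity, and trial counts are illustrative):

% Empirical check: success rate of l1 recovery as m grows
n = 200; k = 10; trials = 20;
for m = 20:20:120
    success = 0;
    for t = 1:trials
        A = randn(m,n);
        u = zeros(n,1); p = randperm(n); u(p(1:k)) = randn(k,1);
        b = A*u;
        cvx_begin quiet
            variable x(n)
            minimize(norm(x,1))
            subject to
                A*x == b
        cvx_end
        success = success + (norm(x - u) <= 1e-4*norm(u));
    end
    fprintf('m = %3d: exact recovery in %d of %d trials\n', m, success, trials);
end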

What if u is not sparse?

u is not sparse in most cases, but it needs to be sparse in some sense:
- Sparse under a basis (e.g., Fourier, wavelet, curvelet)
- Sparse under a linear transform (overcomplete dictionary, not invertible)
- Sparse under a nonlinear transform (e.g., total variation)
- Has a low rank
- Has a low-dimensional embedding

Compressive Imaging

Natural images u are sparse in the wavelet domain: for a wavelet transform Φ, Φu is approximately sparse. To recover u from A*u = b, solve

  min_u  ||Φu||_1  subject to  A*u = b

or the unconstrained version

  min_u  mu*||Φu||_1 + (1/2)*||A*u - b||_2^2

Introduction to Compressed Sensing

[Diagram: compressed sensing pipeline with blocks for input, signal representation, linear encoding, signal acquisition, and signal reconstruction.]

Rice Single-Pixel Camera (Wakin et al.)

[Diagram: object → light → Lens 1 → DMD+ALP board → Lens 2 → photodiode circuit.]

TI Digital Micromirror Device (DMD)

DLP 1080p: 1920 x 1080 resolution

Compressed MR Imaging

[Figure sequence: MRI acquisition. Magnetic field + RF excitation measures the 2D Fourier transform of the image; the inverse Fourier transform recovers the image; measuring only a subset of the frequencies promises less scan time.]

Compressed MR Imaging

Data model: b = R*u, where R is a Fourier matrix and the input is the frequency response b.
Reconstruction:
- R is full: solve b = R*u; the solution is u = R^{-1} b.
- R is partial: b = R*u is underdetermined. Use priors:
  - u is sparse in the wavelet domain
  - u has a small total variation (the l1 norm of the gradient of u)
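A minimal MATLAB sketch of this partial-Fourier data model (the sampling mask and 30% rate are illustrative assumptions; phantom requires the Image Processing Toolbox):

% Partial-Fourier measurements: keep a random subset of 2D FFT coefficients
N = 256;
u = phantom(N);                 % test image (Image Processing Toolbox)
mask = rand(N) < 0.3;           % keep roughly 30% of the frequencies
Fu = fft2(u);                   % full frequency response: "R is full"
u_full = real(ifft2(Fu));       % u = R^{-1} b recovers u exactly
b = Fu(mask);                   % partial measurements: b = R*u, underdetermined
% Recovering u from b now requires the sparsity / total-variation priors above.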

Numerical Examples

[Figure: compressed MR imaging reconstruction examples.]

Compressed Sensing

Utilize the sparsity of a signal to reconstruct it from fewer measurements than are usually needed.
Procedure: encoding, sensing, decoding.
- Encoding: nonadaptive, linear, random
- Sensing: acquiring measurements
- Decoding: using optimization to recover the signal
Advantages: improve the capabilities of physical devices.
- Infrared imaging: higher resolution
- CT: lower radiation dosage
- MRI: less scan time
- Multi-sensor distributed networks: higher throughput, longer battery life
- DNA microarrays: lower cost
- Also: low-light imaging, microscopy, video acquisition, hyperspectral image classification

The Challenges

Such optimization problems:
- have large-scale and dense data (though not always)
- are non-smooth
However, it is the solution that is sparse.

Classical l1 solvers

Most of them solve min ||u||_1 subject to A*u = b using simplex-type methods, interior-point methods, or Huber-norm approximation.
- Applied to geophysics and economics problems
- Invert or factorize a matrix involving B
Next, recent solvers...

Different Formulations

- Unconstrained:  min_u  mu*||u||_1 + (1/2)*||A*u - b||_2^2
- Basis pursuit:  min_u  ||u||_1  subject to  A*u = b
- LASSO:          min_u  ||A*u - b||_2^2  subject to  ||u||_1 <= t
- Constrained:    min_u  ||u||_1  subject to  ||A*u - b||_2 <= epsilon

(A Subset of) Recent Algorithm Types

Path-following: LARS, etc.
- Start from an easy problem and gradually transform it into the original one
- The solution path is piecewise linear
- Solve one (small) system of linear equations at each breakpoint
Specialized interior-point method: l1_ls
- More accurate than first-order methods
- Uses truncated Newton's method and preconditioned conjugate gradients
Operator splitting
- Cheap per-iteration cost, more iterations
- Accelerated by line search and continuation
- Obtains the optimal support quickly
Gradient projection: GPSR, similar to operator splitting
Bregman method: originally for the constrained problem; finite convergence

Operator Splitting

Observation: ||u||_1 is nonsmooth but separable; ||A*u - b||_2^2 is smooth but non-separable.
Solution: apply a different operation to each part.

Shrinkage, a component-wise separable operator

Shrinkage (soft thresholding) was previously used by, as far as I know:
- Chambolle, DeVore, Lee, and Lucier
- Figueiredo, Nowak, and Wright
- Daubechies, Defrise, and De Mol
- Elad, Matalon, and Zibulevsky
- Hale, Yin, and Zhang
- Darbon and Osher
- Combettes and Pesquet
MATLAB command: sign(v).*max(abs(v)-mu, 0)

FPC (Hale, Yin, and Zhang, 2007)

The method is not new; the theorems are.
Theorem: under mild conditions, the algorithm converges linearly.
Theorem: the algorithm obtains the optimal support in a finite number of iterations.
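To make this concrete, here is a minimal sketch of the fixed-point shrinkage iteration underlying FPC, omitting the continuation and line-search refinements of the actual solver; the step size, tolerance, and problem setup are illustrative:

% Fixed-point shrinkage iteration for min mu*||u||_1 + 0.5*||A*u - b||_2^2
n = 200; m = 100; A = randn(m,n);              % reuse the earlier test problem
u_true = full(sprandn(n,1,.1)); b = A*u_true;
mu = 1e-3;
shrink = @(v,t) sign(v) .* max(abs(v) - t, 0);
tau = 1/norm(A)^2;                             % step size < 2/||A'*A||
u = zeros(n,1);
for iter = 1:2000
    u_new = shrink(u - tau*(A'*(A*u - b)), tau*mu);
    if norm(u_new - u) <= 1e-8*max(norm(u),1), u = u_new; break; end
    u = u_new;
end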

Bregman Iterative Method (Yin, Osher, Goldfarb, and Darbon, 2007)

For the constrained problem min ||u||_1 subject to A*u = b, one approach is to let mu go to 0 and solve a sequence of unconstrained problems. The Bregman iterative method instead fixes mu and iterates.
Theorem: the Bregman method converges to an exact solution in a finite number of steps for any mu > 0.
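A minimal sketch of the Bregman iterations in their add-back-the-residual form; unconstrained_l1 is a hypothetical helper standing in for any unconstrained solver (e.g., the shrinkage iteration above):

% Bregman iterations for min ||u||_1 subject to A*u = b, with fixed mu > 0
bk = b;                                 % working right-hand side
u  = zeros(size(A,2), 1);
for k = 1:20
    % inner solve: argmin_u mu*||u||_1 + 0.5*||A*u - bk||_2^2
    u  = unconstrained_l1(A, bk, mu);   % hypothetical helper
    bk = bk + (b - A*u);                % add back the constraint residual
    if norm(A*u - b) <= 1e-10*norm(b), break; end
end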

Understanding the Unconstrained Problem

[Figure.]

Understanding the Nonlinear Bregman Iterations

[Figure sequence illustrating the iterates of the nonlinear Bregman method.]

Features:
- u^k = u* for some finite k
- the path depends on mu

Linearized Bregman Iterations

Does not converge finitely, but:
- the code is extremely simple (MATLAB: 2 lines + a stopping criterion)
- larger components come out first, yet the code is still fast at recovering the small components
- for 2^10 x 2^20 problems, roughly 40 seconds
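A minimal sketch of those two lines in context (the step size delta, threshold mu, and problem setup are illustrative):

% Linearized Bregman for min ||u||_1 subject to A*u = b
n = 200; m = 100; A = randn(m,n);
u_true = full(sprandn(n,1,.1)); b = A*u_true;
shrink = @(v,t) sign(v) .* max(abs(v) - t, 0);
delta = 1/norm(A)^2; mu = 5;                  % illustrative parameters
v = zeros(n,1); u = zeros(n,1);
for k = 1:10000
    v = v + A'*(b - A*u);                     % line 1: residual update
    u = delta * shrink(v, mu);                % line 2: shrinkage
    if norm(A*u - b) <= 1e-6*norm(b), break; end
end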

l_p minimization versus l_1 minimization

Rao-Kreutz-Delgado (FOCUSS, 1999), Chartrand (2006), Candes-Wakin-Boyd (2007), Chartrand-Yin (2007).

Iteratively reweighted least squares (IRLS) seems to be better than l_1.
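A minimal sketch of an IRLS scheme for smoothed l_p minimization in the spirit of Chartrand-Yin (the value of p, the epsilon schedule, and the problem setup are illustrative assumptions):

% IRLS for min ||u||_p^p subject to A*u = b, 0 < p <= 1
n = 200; m = 100; A = randn(m,n);
u_true = full(sprandn(n,1,.05)); b = A*u_true;
p = 0.5; ep = 1;
u = A' * ((A*A') \ b);                   % least-norm starting point
for k = 1:100
    w = (u.^2 + ep).^(p/2 - 1);          % smoothed l_p weights
    Q = spdiags(1./w, 0, n, n);
    u = Q * (A' * ((A*Q*A') \ b));       % weighted least-norm solution
    ep = max(ep/10, 1e-8);               % shrink the smoothing (illustrative)
end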


Sparsity under Nonlinear Transforms

For total variation, long story short, there are several existing choices:
- Operator splitting (Darbon-Osher; Wang-Yin-Zhang, 2007)
- Solve using max-flow on an s-t graph (Darbon-Sigelle, Chambolle, Goldfarb-Yin)
- A different splitting (Wang-Yin-Zhang, 2007)
- Legendre-Fenchel transform / duality (Yin, 2007)
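As an illustration of the splitting idea, here is a minimal sketch of variable-splitting TV denoising (anisotropic TV, periodic boundaries); mu, beta, the iteration count, and the test image are illustrative, and the actual solvers add refinements such as continuation on beta:

% TV denoising by splitting: min TV(u) + mu/2*||u - f||^2.
% Introduce w ~ grad(u) with penalty beta; alternate shrinkage on w
% and an exact FFT solve on u.
f = double(imread('cameraman.tif'))/255;      % assumes a grayscale test image
mu = 20; beta = 10;
[ny, nx] = size(f);
% periodic finite-difference kernels and their spectra
dh = zeros(ny,nx); dh(1,1) = -1; dh(1,2) = 1; % horizontal difference
dv = zeros(ny,nx); dv(1,1) = -1; dv(2,1) = 1; % vertical difference
denom = mu + beta*(abs(fft2(dh)).^2 + abs(fft2(dv)).^2);
shrink = @(v,t) sign(v) .* max(abs(v) - t, 0);
u = f;
for k = 1:50
    % w-step: component-wise shrinkage on the gradient (anisotropic TV)
    ux = real(ifft2(fft2(u).*fft2(dh)));
    uy = real(ifft2(fft2(u).*fft2(dv)));
    wx = shrink(ux, 1/beta);
    wy = shrink(uy, 1/beta);
    % u-step: quadratic subproblem solved exactly with FFTs
    rhs = mu*f + beta*(real(ifft2(conj(fft2(dh)).*fft2(wx))) ...
                     + real(ifft2(conj(fft2(dv)).*fft2(wy))));
    u = real(ifft2(fft2(rhs)./denom));
end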


[Figure: input image, morphological opening, and TV-L1 result, side by side.]


Acknowledgements

Special thanks: Prof. Yuan (袁老师) and Prof. Dai (戴老师) for their invitation, and Liu Xin (刘昕) for his arrangements.
Collaborators on L1-related minimization:
- Columbia: Don Goldfarb, Jianing Shi
- Rice CAAM: Yin Zhang, Elaine Hale, Yilun Wang
- Rice ECE: Rich Baraniuk, Kevin Kelly, et al.
- UCLA: Stan Osher, Jerome Darbon, Bin Dong, Yu Mao