ELEG 867 - Compressive Sensing and Sparse Signal Representations
Gonzalo R. Arce
Department of Electrical and Computer Engineering, University of Delaware
Fall 2011

Outline
- Compressive sensing signal reconstruction
- Geometry of CS
- Reconstruction algorithms:
  - Matching Pursuit
  - Orthogonal Matching Pursuit
  - Iterative Hard Thresholding
  - Weighted Median Regression

Compressive Sensing Signal Reconstruction
Goal: recover the signal x from the measurements y.
Problem: the random projection Φ is not full rank, so the inverse problem is ill-posed.
Solution: exploit the sparse/compressible geometry of the acquired signal x.
[Figure: y = Φx, with Φ a short, wide M x N matrix.]

Suppose there is an S-sparse solution to y = Φx.
Combinatorial optimization problem: (P0) min_x ‖x‖_0 s.t. Φx = y
Convex optimization problem: (P1) min_x ‖x‖_1 s.t. Φx = y
If 2δ_{3S} + δ_{2S} < 1, the solutions of (P0) and (P1) are the same.
E. Candès, "Compressive Sampling," Proc. International Congress of Mathematicians, Madrid, Spain, 2006.
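
A minimal numerical sketch of (P1) as a linear program, not from the original slides: the problem sizes, the random seed, and the use of scipy.optimize.linprog are illustrative assumptions. The standard trick is to split x = u - v with u, v >= 0, so that ‖x‖_1 becomes the linear objective sum(u) + sum(v).

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
N, M, S = 64, 24, 3                              # signal length, measurements, sparsity (assumed)
x_true = np.zeros(N)
x_true[rng.choice(N, S, replace=False)] = rng.standard_normal(S)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # random projection
y = Phi @ x_true

# LP variables z = [u; v]; objective sum(u) + sum(v) = ||x||_1
c = np.ones(2 * N)
A_eq = np.hstack([Phi, -Phi])                    # Phi (u - v) = y
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * N))
x_hat = res.x[:N] - res.x[N:]
print("l1 recovery error:", np.linalg.norm(x_hat - x_true))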

The Geometry of CS
Sparse signals have few nonzero coefficients.
[Figure: the sets of 1-sparse and 2-sparse signals in R^3, drawn on the axes x_1, x_2, x_3.]

ℓ0 Recovery
Reconstruction should be:
- Consistent with the model: x as sparse as possible, i.e., min_x ‖x‖_0.
- Consistent with the measurements: y = Φx.
[Figure: the feasible set {x : y = Φx} intersecting the sparse-signal set in R^3.]

ℓ0 Recovery
min_x ‖x‖_0 s.t. Φx = y
‖x‖_0 is the number of nonzero elements of x, so this seeks the sparsest signal consistent with the measurements y.
Requires only M << N measurements.
Combinatorial NP-hard problem: for x ∈ R^N with sparsity S, the complexity is O(N^S), which is too slow for practical implementation.
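
To make the combinatorial cost concrete, here is a brute-force (P0) sketch (an illustrative helper, not from the slides): it tries every size-S support with a least-squares fit, on the order of C(N, S) solves, so it is only usable at toy sizes.

import numpy as np
from itertools import combinations

def l0_brute_force(Phi, y, S, tol=1e-8):
    # Exhaustive search over all size-S supports of x.
    N = Phi.shape[1]
    for support in combinations(range(N), S):
        cols = list(support)
        sub = Phi[:, cols]
        coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
        if np.linalg.norm(sub @ coef - y) < tol:   # exact fit found
            x = np.zeros(N)
            x[cols] = coef
            return x
    return None                                     # no S-sparse solution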

ℓ1 Recovery
A more tractable model:
min_x ‖x‖_1 s.t. Φx = y
The ℓ1 norm also induces sparsity, and the constraint y = Φx is kept.
[Figure: the ℓ1 ball touching the feasible set {x : y = Φx} at a sparse point in R^3.]

Phase Transition Diagram
For ℓ1 minimization there is a well-defined region of the (M/N, S/M) plane in which recovery succeeds.
[Figure: phase transition diagram over (δ, ρ) ∈ [0, 1]^2.]
δ = M/N is a normalized measure of the problem indeterminacy.
ρ = S/M is a normalized measure of the sparsity.
Red region: exact reconstruction typically fails.
Blue region: exact reconstruction typically succeeds.

ℓ2 Recovery
min_x ‖x‖_2 s.t. Φx = y
Minimum-norm solution: x̂ = Φ^T (Φ Φ^T)^{-1} y.
Solved with quadratic programming: least squares or interior-point methods.
The solution is almost never sparse.
[Figure: the ℓ2 ball meeting {x : y = Φx} at a non-sparse point x' in R^3.]
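
A short sketch (illustrative sizes and seed, not from the slides) showing that the minimum ℓ2-norm solution reproduces y exactly yet is dense:

import numpy as np

rng = np.random.default_rng(1)
M, N = 24, 64
Phi = rng.standard_normal((M, N))
x_true = np.zeros(N)
x_true[[3, 17, 40]] = [1.0, -0.5, 2.0]           # 3-sparse ground truth
y = Phi @ x_true

x_l2 = Phi.T @ np.linalg.solve(Phi @ Phi.T, y)   # minimum l2-norm solution
print("consistent with y:", np.allclose(Phi @ x_l2, y))
print("significant entries:", np.sum(np.abs(x_l2) > 1e-3))  # close to N, not 3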

ℓ2 Recovery
Problem: a small ℓ2 norm does not imply sparsity.
[Figure: x' has a small ℓ2 norm but is not sparse.]

CS Example in the Time Domain
y = Φx: a random projection of a time-domain sparse signal.
Nonlinear reconstruction: min_x ‖x‖_1 s.t. y = Φx.
[Figure: sparse signal, random projection, reconstruction, and error signal.]

CS Example in the Wavelet Domain
Reconstruction of an image (N = 1 megapixel) that is sparse in the wavelet domain (S = 25,000) from M = 96,000 incoherent measurements.
[Figure: original image (25,000 wavelets) and recovered image.]
E. J. Candès and J. Romberg, "Sparsity and Incoherence in Compressive Sampling," Inverse Problems, vol. 23, no. 3, pp. 969-985, 2007.

Reconstruction Algorithms
Different formulations and implementations have been proposed to find the sparsest x subject to y = Φx, which makes results obtained by different methods difficult to compare. They are broadly classified into:
- Regularization formulations (replace the combinatorial problem with a convex optimization)
- Greedy algorithms (iterative refinement of a sparse solution)
- Bayesian framework (assume a prior distribution on the sparse coefficients)

Greedy Algorithms
Iterative algorithms that select an optimal subset of the desired signal x at each iteration. Some examples:
- Matching Pursuit (MP): min_x ‖x‖_0 s.t. Φx = y
- Orthogonal Matching Pursuit (OMP): min_x ‖x‖_0 s.t. Φx = y
- Compressive Sampling Matching Pursuit (CoSaMP): min_x ‖y - Φx‖_2^2 s.t. ‖x‖_0 ≤ S
- Iterative Hard Thresholding (IHT): min_x ‖y - Φx‖_2^2 s.t. ‖x‖_0 ≤ S
- Weighted Median Regression: min_x ‖y - Φx‖_1 + λ‖x‖_0
These methods build an approximation of the sparse signal at each iteration.

Matching Pursuit (MP)
A sparse approximation algorithm used to find the solution to min_x ‖x‖_0 s.t. Φx = y.
Key ideas:
- The measurements y = Φx are a sum of S columns of Φ.
- The algorithm must identify which S columns contribute to y.
[Figure: y as a combination of S columns of Φ.]

Matching Pursuit (MP) Algorithm
Input and parameters: observation vector y, measurement matrix Φ.
Initialize: initial solution x^(0) = 0, initial residual r^(0) = y.
Iteration k:
- Compute the current correlations: c_j^(k) = <φ_j, r^(k-1)>
- Identify the index ĵ = arg max_j |c_j^(k)|
- Update the coefficient: x_ĵ^(k) = x_ĵ^(k-1) + c_ĵ^(k)
- Update the residual: r^(k) = r^(k-1) - c_ĵ^(k) φ_ĵ
D. Donoho et al., SparseLab Version 2.0, 2007.
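
A direct transcription of the MP iteration above into Python (a sketch: the helper name is ours, and it assumes the columns of Φ have unit ℓ2 norm):

import numpy as np

def matching_pursuit(Phi, y, n_iter):
    # Assumes the columns of Phi are normalized to unit l2 norm.
    N = Phi.shape[1]
    x = np.zeros(N)
    r = y.astype(float).copy()
    for _ in range(n_iter):
        c = Phi.T @ r                    # correlations c_j = <phi_j, r>
        j = int(np.argmax(np.abs(c)))    # index of the best-matching column
        x[j] += c[j]                     # coefficient update
        r -= c[j] * Phi[:, j]            # deflate the residual
    return x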

MP Reconstruction Example
[Figure: reconstructed signal and residual at iterations 1 and 5.]

MP Reconstruction Example
[Figure: original signal, reconstructed signal, and residual at iteration 32.]

Orthogonal Matching Pursuit (OMP)
A sparse approximation algorithm used to find the solution to min_x ‖x‖_0 s.t. Φx = y.
Key ideas:
- The measurements y = Φx are a sum of S columns of Φ.
- The algorithm must identify which S columns contribute to y.
[Figure: y as a combination of S columns of Φ.]

Orthogonal Matching Pursuit (OMP) Algorithm
Input and parameters: observation vector y, measurement matrix Φ.
Initialize: initial solution x^(0) = 0, initial support S^(0) = support{x^(0)} = ∅, initial residual r^(0) = y.
Iteration k:
- Compute the current correlations: c_j^(k) = <φ_j, r^(k-1)>
- Identify the index ĵ = arg max_j |c_j^(k)|
- Update the support: S^(k) = S^(k-1) ∪ {ĵ}
- Form Φ_S^(k), the submatrix of Φ containing the columns indexed by S^(k)
- Update the solution: x^(k) = (Φ_S^(k)^T Φ_S^(k))^{-1} Φ_S^(k)^T y
- Update the residual: r^(k) = y - Φ_S^(k) x^(k)
D. Donoho et al., SparseLab Version 2.0, 2007.
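
The same box as code (a sketch under the same unit-norm-column assumption; names are illustrative). The least-squares re-fit over the current support is what keeps the residual orthogonal to every selected column:

import numpy as np

def omp(Phi, y, S):
    N = Phi.shape[1]
    support = []
    r = y.astype(float).copy()
    coef = np.zeros(0)
    for _ in range(S):
        j = int(np.argmax(np.abs(Phi.T @ r)))   # most correlated column
        if j not in support:
            support.append(j)
        sub = Phi[:, support]
        coef, *_ = np.linalg.lstsq(sub, y, rcond=None)  # re-fit on support
        r = y - sub @ coef                      # orthogonal to span(Phi_S)
    x = np.zeros(N)
    x[support] = coef
    return x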

Remarks on the OMP Algorithm
S^(k) is the support of x at iteration k, and Φ_S^(k) is the matrix containing the columns of Φ that belong to this support.
The update step yields the x^(k) that minimizes ‖Φ_S^(k) x - y‖_2^2. At the solution,
Φ_S^(k)^T (Φ_S^(k) x - y) = -Φ_S^(k)^T r^(k) = 0,
which shows that the columns of Φ_S^(k) are orthogonal to the residual r^(k).

OMP Reconstruction Example
[Figure: reconstructed signal and residual at iterations 1 and 5.]

OMP Reconstruction Example
[Figure: reconstructed signal and residual at iterations 9 and 10.]

Iterative Hard Thresholding
A sparse approximation algorithm that solves the problem min_x ‖y - Φx‖_2^2 + λ‖x‖_0.
An iterative algorithm that takes a descent step along the gradient of the ℓ2 term; the solution at each iteration is
x^(k+1) = H_λ(x^(k) + Φ^T (y - Φx^(k))),
where H_λ is a nonlinear operator that retains only the coefficients with the largest magnitudes.
T. Blumensath and M. Davies, "Iterative Hard Thresholding for Compressed Sensing," Applied and Computational Harmonic Analysis, vol. 27, no. 3, pp. 265-274, 2009.

Iterative Hard Thresholding Algorithm
Input and parameters: observation vector y, measurement matrix Φ.
Initialize: x^(0) = 0.
Iteration k:
- Compute an update: a^(k+1) = x^(k) + Φ^T (y - Φx^(k))
- Select the largest elements: x^(k+1) = H_λ(a^(k+1)), pulling the other elements to zero.
D. Donoho et al., SparseLab Version 2.0, 2007.
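
A sketch of the iteration (not from the slides): here the thresholding operator is instantiated as "keep the S largest-magnitude entries", one common choice for H_λ above, and stability of the plain gradient step assumes ‖Φ‖_2 ≤ 1 (rescale Φ and y beforehand if needed).

import numpy as np

def iht(Phi, y, S, n_iter=100):
    N = Phi.shape[1]
    x = np.zeros(N)
    for _ in range(n_iter):
        a = x + Phi.T @ (y - Phi @ x)      # gradient (Landweber) update
        keep = np.argsort(np.abs(a))[-S:]  # S largest-magnitude entries
        x = np.zeros(N)
        x[keep] = a[keep]                  # hard-thresholding operator H
    return x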

IHT Reconstruction Example
[Figure: reconstructed signal and residual at iterations 1 and 5.]

IHT Reconstruction Example
[Figure: original signal, reconstructed signal, and residual at the final iteration.]

Weighted Median Regression Reconstruction Algorithm
When the random measurements are corrupted by noise, the most widely used CS reconstruction algorithms solve
min_x ‖y - Φx‖_2^2 + τ‖x‖_1,
where τ is a regularization parameter that balances the conflicting tasks of minimizing the ℓ2 norm and yielding a sparse solution.
- As τ increases, the solution becomes sparser.
- As τ decreases, the solution approaches the least squares solution.

In the solution to the problem min_x ‖y - Φx‖_2^2 + τ‖x‖_1:
- The ℓ2 term is the data-fitting term, induced by a Gaussian assumption on the noise contamination.
- The ℓ1 term is the sparsity-promoting term.
However, the ℓ2 term leads to the least squares solution, which is known to be very sensitive to high noise levels and to outliers in the contamination.

[Figure: example of signal reconstruction when outliers are present in the random projections.]

To mitigate the effect of impulsive noise in the compressive measurements, a more robust norm is used for the data-fitting term:
min_x ‖y - Φx‖_1 + λ‖x‖_0,   (1)
where the first term is the data-fitting term and the second is the sparsity term.
Least absolute deviation (LAD) offers robustness to a broad class of noise, in particular heavy-tailed noise, and is optimal under the maximum likelihood principle when the contamination follows a Laplacian model.
J. L. Paredes and G. Arce, "Compressive Sensing Signal Reconstruction by Weighted Median Regression Estimates," ICASSP 2010.

Drawbacks: solving the ℓ0-LAD optimization is an NP-hard problem, and a direct solution is infeasible even for modest-sized signals.
To solve this multivariable minimization problem:
- Solve a sequence of scalar minimization subproblems.
- Each subproblem improves the estimate of the solution by minimizing along a selected coordinate with all other coordinates fixed.
- A closed-form solution exists for each scalar subproblem.

Signal reconstruction:
x̂ = arg min_x Σ_{i=1}^M |(y - Φx)_i| + λ‖x‖_0
With the coordinate descent method, the n-th scalar subproblem is
x̂_n = arg min_{x_n} Σ_{i=1}^M |y_i - Σ_{j≠n} φ_{i,j} x_j - φ_{i,n} x_n| + λ|x_n|_0 + λ Σ_{j≠n} |x_j|_0
    = arg min_{x_n} Σ_{i=1}^M |φ_{i,n}| |(y_i - Σ_{j≠n} φ_{i,j} x_j)/φ_{i,n} - x_n| + λ|x_n|_0,
where the first term, Q(x_n), is a weighted least absolute deviation and the second is the regularization.

The solution to the scalar minimization problem min_{x_n} Q(x_n) + λ|x_n|_0 is given by the weighted median
x̃_n = MEDIAN( |φ_{i,n}| ⋄ (y_i - Σ_{j≠n} φ_{i,j} x_j)/φ_{i,n}, i = 1, ..., M ),   n = 1, 2, ..., N,
where ⋄ denotes weighting, followed at the k-th iteration by the hard threshold
x_n^(k) = x̃_n if Q(x̃_n) + λ < Q(0), and x_n^(k) = 0 otherwise.
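
The weighted median itself is simple to compute. A sort-and-scan sketch (an illustrative helper, not from the slides; it returns the lower weighted median):

import numpy as np

def weighted_median(values, weights):
    # The scalar v minimizing sum_i weights[i] * |values[i] - v|.
    v = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    order = np.argsort(v)
    v, w = v[order], w[order]
    csum = np.cumsum(w)
    # First sample where the cumulative weight reaches half the total.
    return v[int(np.searchsorted(csum, 0.5 * csum[-1]))]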

The iterative algorithm detects the nonzero entries of the sparse vector by:
- Robust signal estimation: computing a rough estimate of the signal that minimizes the LAD via the weighted median operator.
- Basis selection: applying a hard-threshold operator to the estimated values.
- Removing the entry's influence from the observation vector.
- Repeating these steps until a performance criterion is reached.
[Figure: original signal and the estimate across iterations.]

Weighted Median Regression Algorithm
Input and parameters: observation vector y, measurement matrix Φ, number of iterations K.
Initialize: iteration counter k = 1, hard-thresholding value λ = λ_0, initial estimate x^(0) = 0.
Step A: For the n-th entry of x, n = 1, 2, ..., N, compute
x̃_n = MEDIAN( |φ_{i,n}| ⋄ (y_i - Σ_{j≠n} φ_{i,j} x̂_j)/φ_{i,n}, i = 1, ..., M ).
Step B: Set x_n^(k) = x̃_n if Q(x̃_n) + λ < Q(0), and x_n^(k) = 0 otherwise.
Step C: Update the estimate and the hard-thresholding parameter: x^(k+1) = x^(k), λ = λ_0 β^k.
Check the stopping criterion: if k < K, set k = k + 1 and go to Step A; otherwise stop.
Output: recovered sparse signal x̂.
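
One possible reading of the loop above as code (a sketch, reusing weighted_median from the previous block; the defaults λ_0, β, K and the assumption that Φ has no zero entries are ours, not from the slides):

import numpy as np

def wm_regression(Phi, y, lam0=1.0, beta=0.8, K=20):
    # Assumes every entry of Phi is nonzero (true a.s. for a Gaussian Phi).
    M, N = Phi.shape
    x = np.zeros(N)
    lam = lam0
    for k in range(K):
        for n in range(N):
            resid = y - Phi @ x + Phi[:, n] * x[n]   # leave coordinate n out
            vals = resid / Phi[:, n]
            w = np.abs(Phi[:, n])
            xt = weighted_median(vals, w)            # Step A: robust estimate
            Q = lambda t: np.sum(w * np.abs(vals - t))
            x[n] = xt if Q(xt) + lam < Q(0.0) else 0.0   # Step B: threshold
        lam *= beta                                   # Step C: decay lambda
    return x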

Example: Signal Reconstruction from Projections in Non-Gaussian Noise
[Figure: original signal and noisy projections, with reconstructions by CoSaMP, L1-LS, Matching Pursuit, and Weighted Median.]

Performance Comparisons
Exact reconstruction for noiseless signals (normalized error smaller than 0.1% of the signal energy).
[Figure.]

Performance Comparisons
[Figure: SNR vs. sparsity for different types of noise.]

Convex Relaxation Approach
The ℓ0-LAD problem can be relaxed to
min_x ‖y - Φx‖_1 + λ‖x‖_1,   (2)
where the first term is the data-fitting term and the second is the sparsity term.
This problem is convex and can be solved by linear programming. However, the ℓ1 norm may not induce the necessary sparsity in the solution, so more measurements are required to achieve a target reconstruction error.

The solution to the scalar problem
x̂ = arg min_x Σ_{i=1}^M |y_i - φ_i x| + λ|x|
is given by
x̂ = MEDIAN( λ ⋄ 0, |φ_1| ⋄ y_1/φ_1, ..., |φ_M| ⋄ y_M/φ_M ).
The regularization induced by the ℓ1 term introduces a zero-valued sample into the weighted median operation. A large regularization parameter implies a large weight on the zero-valued sample, which favors sparsity, but the solution is biased.
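
As a sketch (reusing the weighted_median helper above; the function name is ours), the scalar solution is literally a weighted median with one extra sample:

import numpy as np

def wm_l1_scalar(phi, y, lam):
    # Scalar solution of min_x sum_i |y_i - phi_i * x| + lam * |x|:
    # a weighted median with an extra zero-valued sample of weight lam.
    vals = np.append(y / phi, 0.0)       # samples y_i / phi_i plus the zero
    w = np.append(np.abs(phi), lam)      # weight lam on the zero sample
    return weighted_median(vals, w)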