Winter Conference in Statistics Compressed Sensing. LECTURE #1-2 Motivation & geometric insights

1 Winter Conference in Statistics 2013 Compressed Sensing LECTURE #1-2 Motivation & geometric insights Prof. Dr. Volkan Cevher LIONS/Laboratory for Information and Inference Systems

2 Major trends 160MP: higher resolution / denser sampling × large numbers of sensors × increasing # of modalities / mobility. Motivation: solve bigger / more important problems, decrease acquisition times / costs, entertainment / new consumer products.

3 Problems of the current paradigm Sampling at the Nyquist rate: expensive / difficult. Data deluge: communications / storage. Sample-then-compress: inefficient / impossible / not future proof.

4-6 Recommended for you: a more familiar example Recommender systems observe partial information: ratings, clicks, purchases, compatibilities. The Netflix problem: from approx. 100,000,000 ratings (in a movies × users matrix), predict 3,000,000 ratings. How would you automatically predict? What is it worth?

7-8 Theoretical set-up Matrix completion for Netflix (users × movies). Mathematical underpinnings: compressive sensing, with observations y = Φ x* + n, where Φ is a linear (sampling) operator and n models (adversarial) perturbations. CS: when we have fewer samples than the ambient dimension.
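
The observation model above is easy to write down; here is a minimal numpy sketch (all dimensions, sparsity level, and noise scale are illustrative assumptions, not values from the lecture).

```python
import numpy as np

# Minimal sketch of the compressive observation model y = Phi @ x + n with
# fewer samples (M) than the ambient dimension (N); all sizes are assumptions.
rng = np.random.default_rng(0)
N, M, K = 1000, 100, 10                # ambient dim, # samples, sparsity

x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)   # sparse signal
Phi = rng.standard_normal((M, N)) / np.sqrt(M)                # sampling operator
n = 0.01 * rng.standard_normal(M)                             # perturbation
y = Phi @ x + n

print(y.shape)                         # (100,): far fewer observations than unknowns
```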

9 Linear Inverse Problems Myriad applications involve linear dimensionality reduction: from deconvolution to data mining, from compression to compressive sensing, from geophysics to medical imaging [Baraniuk, Cevher, Wakin 2010; Carin et al. 2011]

10 Linear Inverse Problems Challenge: recover x* from the observations y = Φ x* + n when there are far fewer observations than unknowns.

11 Linear Inverse Problems Two views: deterministic (prior: sparsity) and probabilistic (prior: distribution; metric: likelihood / posterior).

12 Deterministic Low-Dimensional Models

13-14 Sparse representations Sparse signal: only K out of N coordinates nonzero (in an appropriate representation); the set of nonzero locations is the support. (Plot: coefficient magnitude vs. sorted index.) Sparse representations: sparse transform coefficients. Basis representations: wavelets, DCT. Frame representations: Gabor, curvelets, shearlets. Other dictionary representations.

15-18 Sparse representations Sparse signal: only K out of N coordinates nonzero. Sparse representations: sparse transform coefficients. A fundamental impact: the signal becomes effectively low dimensional*. *: if we knew the locations of the large coefficients; more on this later.
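
As a concrete illustration of the low-dimensionality claim, the sketch below (my own toy example; the DCT choice and sizes are assumptions) builds a signal that is exactly K-sparse in a transform domain and measures the error of keeping only its K largest transform coefficients.

```python
import numpy as np
from scipy.fft import dct, idct

# Minimal sketch: a length-N signal that is K-sparse in the DCT domain is
# captured exactly by its K largest transform coefficients.
rng = np.random.default_rng(0)
N, K = 256, 10

alpha = np.zeros(N)
alpha[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
x = idct(alpha, norm="ortho")                    # synthesis: x = Psi alpha

coeffs = dct(x, norm="ortho")                    # analysis
keep = np.argsort(np.abs(coeffs))[::-1][:K]      # K largest coefficients
alpha_K = np.zeros(N)
alpha_K[keep] = coeffs[keep]
x_K = idct(alpha_K, norm="ortho")                # best K-term approximation

print(np.linalg.norm(x - x_K) / np.linalg.norm(x))   # ~0 for a truly K-sparse signal
```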

19 Low-dimensional signal models Information level: pixels vs. large wavelet coefficients (blue = 0); sparse signals, low-rank matrices, nonlinear models.

20 Low-dimensional signal models These lectures: low-dimensional models based on linear representations (sparse signals, low-rank matrices, nonlinear models).

21-22 Linear representation of low-dimensional models A key notion in sparse representation: synthesis of the signal using a few vectors. A slightly different mathematical formalism helps generalization. Synthesis model: x = Σ_i c_i a_i with c_i ≥ 0, i.e., a linear (positive) combination of elements a_i from an atomic set A [Chandrasekaran et al. 2010]. Sparse representations via the atomic formulation. Example:
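
The example can be mimicked for the simplest atomic set; the sketch below (my own illustration, not the slide's) writes a sparse vector as a nonnegative combination of the signed canonical atoms ±e_j.

```python
import numpy as np

# Minimal sketch: synthesis model x = sum_i c_i a_i with c_i >= 0, where the
# atoms are the signed canonical vectors +/- e_j (the simplest atomic set).
x = np.array([2.0, 0.0, -0.5])
atoms = np.vstack([np.eye(3), -np.eye(3)])                   # atomic set A
c = np.concatenate([np.maximum(x, 0), np.maximum(-x, 0)])    # nonnegative weights

print(np.allclose(c @ atoms, x))   # True: x is synthesized from a few atoms
print(c.sum())                     # 2.5, which equals ||x||_1
```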

23-28 Linear representation of low-dimensional models Basic definitions on low-dimensional atomic representations: conv(A), the convex hull of the atoms in A, is the atomic ball; the atomic norm* is ||x||_A = inf{ t > 0 : x ∈ t·conv(A) }; alternatively, ||x||_A = inf{ Σ_a c_a : x = Σ_a c_a a, c_a ≥ 0, a ∈ A }. *: requires A to be centrally symmetric.

29-30 Linear representation of low-dimensional models Examples with easy forms: sparse vectors, low-rank matrices*, binary vectors (*: symmetric matrices). Examples with not-so-easy forms: A an infinite set of unit-norm rank-one tensors; A a finite (but large) set of permutation matrices; A an infinite set of orthogonal matrices; A an infinite set of matrices constrained by eigenvalues; A an infinite set of measures; A a finite (but large) set of cut matrices. [Chandrasekaran et al. 2010]
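
For the "easy" cases the atomic norm reduces to familiar norms; a minimal sketch (my own, with toy data) evaluating two of them:

```python
import numpy as np

# Minimal sketch: for signed canonical atoms the atomic norm is the l1 norm;
# for unit-norm rank-one atoms it is the nuclear norm (sum of singular values).
x = np.array([3.0, 0.0, -1.0, 0.0])
l1_atomic_norm = np.sum(np.abs(x))

M = np.outer([1.0, 2.0, 0.0], [0.5, -1.0]) + np.outer([0.0, 1.0, 1.0], [1.0, 1.0])
nuclear_atomic_norm = np.sum(np.linalg.svd(M, compute_uv=False))

print(l1_atomic_norm, nuclear_atomic_norm)
```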

31 Linear representation of low-dimensional models Pop-quiz: What is?

32 Linear representation of low-dimensional models Pop-quiz: HINT: What is?

33 Linear representation of low-dimensional models Pop-answer: What is?

34-37 Towards algorithms: a geometric perspective Other key concepts: the tangent cone at x is the set of descent directions along which the atomic norm does not increase.
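
The descent-cone definition can be checked numerically; the sketch below is a toy illustration for the l1 atomic norm, with test points and directions of my own choosing.

```python
import numpy as np

# Minimal sketch: d is in the tangent (descent) cone of the l1 ball at x if a
# small step along d does not increase the atomic norm ||.||_1.
def in_l1_tangent_cone(x, d, t=1e-6):
    return np.sum(np.abs(x + t * d)) <= np.sum(np.abs(x)) + 1e-12

x = np.array([1.0, 0.0, 0.0])                              # a 1-sparse point
print(in_l1_tangent_cone(x, np.array([-1.0, 0.5, 0.0])))   # True: a descent direction
print(in_l1_tangent_cone(x, np.array([0.0, 1.0, 0.0])))    # False: the norm increases
```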

40-43 Towards algorithms: a geometric perspective Consider the convex criterion: minimize the atomic norm ||x||_A subject to the data constraint y = Φ x.

44 Towards algorithms: a geometric perspective Key observation: the convex criterion recovers x* exactly when the null space of Φ intersects the tangent cone at x* only at the origin.

45-49 Towards algorithms: a geometric perspective How about noise? Stability assumption: the restricted minimum singular value of Φ on the tangent cone is at least ε, i.e. ||Φ z||_2 ≥ ε ||z||_2 for every z in the tangent cone at x*; we want ε large to minimize the overlap between the tangent cone and directions that Φ nearly annihilates.

50-55 Towards algorithms: a geometric perspective Can we guarantee the following?* YES: with randomized measurements! Gordon's Minimum Restricted Singular Values Theorem gives a probabilistic characterization (probabilistic vs. deterministic) [Gordon 1988]. Key concept: the width of the tangent cone! *: without knowing
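
A quick way to see the randomized-measurement guarantee at work is to solve the atomic-norm criterion (here the l1 case) as a linear program; the sketch below is my own, with assumed problem sizes, and recovers a K-sparse vector from M < N Gaussian measurements.

```python
import numpy as np
from scipy.optimize import linprog

# Minimal sketch: basis pursuit, min ||x||_1 s.t. A x = y, written as an LP in
# the split variables x = u - v with u, v >= 0; A is a random Gaussian matrix.
rng = np.random.default_rng(1)
N, M, K = 100, 40, 5

x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
A = rng.standard_normal((M, N)) / np.sqrt(M)
y = A @ x_true

c = np.ones(2 * N)                        # objective = sum(u) + sum(v) = ||x||_1
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y,
              bounds=[(0, None)] * (2 * N), method="highs")
x_hat = res.x[:N] - res.x[N:]

print("recovery error:", np.linalg.norm(x_hat - x_true))   # ~0 with enough measurements
```

Shrinking M in this toy setup eventually makes recovery fail, which is the tangent-cone-width phenomenon that Gordon's theorem quantifies.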


59-61 Towards algorithms: a geometric perspective Can we guarantee the following?* A 1-sparse x* and 1 random measurement. *: without knowing

62 Towards algorithms: a geometric perspective Can we guarantee the following?* A projected 6D hypercube with 64 vertices Blessing-of-dimensionality!

63 Towards algorithms: a geometric perspective Pop-quiz: What is the probability that we can determine a 2-sparse x* with 1-random measurement?

64 Towards algorithms: a geometric perspective Pop-answer: Tangent cone is too wide! Need at least 2 measurements!
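
The pop-answer is easy to probe empirically; the sketch below is my own toy experiment (dimensions and trial counts are assumptions) estimating how often basis pursuit recovers a 1-sparse versus a 2-sparse vector from a single random measurement.

```python
import numpy as np
from scipy.optimize import linprog

# Minimal sketch: Monte Carlo estimate of the l1-recovery success rate from a
# single Gaussian measurement, for 1-sparse and 2-sparse targets in R^4.
rng = np.random.default_rng(4)
N, trials = 4, 300

def bp_recovers(x_true, A):
    y = A @ x_true
    res = linprog(np.ones(2 * N), A_eq=np.hstack([A, -A]), b_eq=y,
                  bounds=[(0, None)] * (2 * N), method="highs")
    x_hat = res.x[:N] - res.x[N:]
    return np.linalg.norm(x_hat - x_true) < 1e-6

for K in (1, 2):
    hits = 0
    for _ in range(trials):
        x = np.zeros(N)
        x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
        hits += bp_recovers(x, rng.standard_normal((1, N)))
    print(f"K={K}: empirical recovery rate {hits / trials:.2f}")
```

Under these assumptions the 2-sparse case essentially never succeeds, consistent with the answer above that at least 2 measurements are needed.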

65 Take home messages [Chandrasekaran et al. 2010] convex polytope <> atomic norm: geometry (and algebra) of representations in high dimensions; geometric perspective <> convex criteria: convex optimization algorithms in high dimensions; tangent cone width <> # of randomized samples: probabilistic concentration-of-measure in high dimensions.

66-67 Back to the initial example Matrix completion for Netflix (movies × users). What is low-rank? What does the simple low-rank assumption buy? Quite a lot of extrapolation power!
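
One way to see the extrapolation power is a small completion experiment; the sketch below (my own illustration, not the lecture's algorithm) runs iterative soft-impute / singular value thresholding on a synthetic low-rank "ratings" matrix, with sizes, rank, and threshold chosen as assumptions.

```python
import numpy as np

# Minimal sketch: complete a low-rank matrix from a random subset of entries by
# iterative soft-impute (fill, soft-threshold the singular values, repeat).
rng = np.random.default_rng(2)
n_users, n_movies, r = 60, 50, 3

M_full = rng.standard_normal((n_users, r)) @ rng.standard_normal((r, n_movies))
mask = rng.random(M_full.shape) < 0.4          # observe ~40% of the ratings

X, tau = np.zeros_like(M_full), 5.0            # estimate and singular-value threshold
for _ in range(200):
    Y = np.where(mask, M_full, X)              # keep observed entries, impute the rest
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    X = (U * np.maximum(s - tau, 0.0)) @ Vt    # soft-threshold the spectrum

err = np.linalg.norm((X - M_full)[~mask]) / np.linalg.norm(M_full[~mask])
print("relative error on unobserved entries:", err)
```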

69 three and do this fast with theoretical guarantees

70 Sampling/sketching design (+Coding theory +Theoretical computer science +Learning theory +Databases) Imaging architecture: scene, random pattern on DMD array, single photon detector, image reconstruction or processing. Structured random matrices, 1-bit CS, expanders & extractors.
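
A toy version of the coded acquisition in the diagram (my own sketch; image size and number of patterns are assumptions): each compressive sample correlates the scene with a random binary DMD pattern.

```python
import numpy as np

# Minimal sketch: single-detector acquisition with random binary (DMD) patterns;
# each measurement is the inner product of the scene with one pattern.
rng = np.random.default_rng(5)
scene = rng.random((32, 32))                           # toy image
M = 200                                                # number of coded snapshots

patterns = rng.integers(0, 2, size=(M, scene.size))   # random 0/1 patterns
y = patterns @ scene.ravel()                           # one detector reading each

print(y.shape)                                         # (200,) compressive samples
```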

71-72 Structured recovery Sparsity (+Theoretical computer science +Learning theory +Optimization +Databases) Sparse vector: only K out of N coordinates nonzero. Structured sparse vector: only certain K out of N coordinates nonzero. (Plot: coefficient magnitude vs. sorted index.)

73 Structured recovery Structured sparsity (+Theoretical computer science +Learning theory +Optimization +Databases): requires smaller sketches, enhanced recovery, faster recovery. Support of the solution <> modular approximation problem: integer linear program, matroid structured sparse models, clustered / diversified sparsity models; tightly connected with max-cover, bin-packing, knapsack problems. Recovery with low-dimensional models, including low-rank.
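
As a toy instance of the "support of the solution as a modular approximation problem" idea, the sketch below (my own example, not one of the models cited) projects a vector onto a clustered-sparsity model by keeping the best contiguous block of K coordinates.

```python
import numpy as np

# Minimal sketch: projection onto a simple structured sparsity model -- keep the
# contiguous block of length K with the largest energy, rather than the K
# largest entries individually.
def best_contiguous_block(x, K):
    energy = np.convolve(x ** 2, np.ones(K), mode="valid")  # energy of each block
    start = int(np.argmax(energy))
    z = np.zeros_like(x)
    z[start:start + K] = x[start:start + K]
    return z

x = np.array([0.1, 0.2, 3.0, 2.5, 2.8, 0.1, 0.2, 2.0])
print(best_contiguous_block(x, 3))   # keeps the block [3.0, 2.5, 2.8]
```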

74 Quantum tomography Quantum state estimation (+Theoretical computer science +Databases +Information theory +Optimization) A state of n possibly-entangled qubits takes ~2^n bits to specify, even approximately. Recovery with rank and trace constraints with M = O(N): 1. Create Pauli measurements (semi-random). 2. Estimate Tr(Φ_i ρ) for each 1 ≤ i ≤ M. 3. Find any hypothesis state σ s.t. Tr(Φ_i σ) ≈ Tr(Φ_i ρ) for all 1 ≤ i ≤ M. Huge dimensional problem! We (desperately) need scalable algorithms, and also theory for perfect density estimation.

75 Learning theory and methods (+Learning theory +Optimization +Information theory +Theoretical computer science) A fundamental problem: given data, learn a mapping. Our interest <> non-parametric functions, graphs (e.g., social networks), dictionary learning. Rigorous foundations <> sample complexity, approximation guarantees, tractability. Key tools <> sparsity / low-rankness, submodularity, smoothness.

76-78 Compressible priors (+Learning theory +Statistics +Information theory) Goal: seek distributions whose iid realizations can be well-approximated as sparse. Definition: relative k-term approximation (the error of the best k-term approximation, relative to the signal's norm). Classical and new examples. Motivations: a deterministic embedding scaffold for the probabilistic view; analytical proxies for sparse signals; learning (e.g., dimensionality-reduced data); algorithms (e.g., structured sparse); information theoretic (e.g., coding); lots of applications in vision, image understanding / analysis.
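
The "well-approximated as sparse" goal is easy to probe empirically; the sketch below (my own comparison; the distributions and sizes are assumptions) contrasts the relative k-term approximation error of light- and heavy-tailed iid draws.

```python
import numpy as np

# Minimal sketch: iid draws from a heavy-tailed prior are far better approximated
# by their k largest entries than iid Gaussian draws are.
rng = np.random.default_rng(3)
N, k = 10_000, 500

def rel_k_term_error(x, k):
    # relative error of the best k-term approximation (keep the k largest entries)
    mags = np.sort(np.abs(x))[::-1]
    return np.sqrt(np.sum(mags[k:] ** 2) / np.sum(mags ** 2))

print("Gaussian :", rel_k_term_error(rng.standard_normal(N), k))
print("Student-t:", rel_k_term_error(rng.standard_t(df=1.5, size=N), k))
```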

79 References M. Afonso, J. Bioucas-Dias, M. Figueiredo, Fast image recovery using variable splitting and constrained optimization, IEEE Transactions on Image Processing, vol. 19, H. Attouch, J. Bolte, P. Redont, A. Soubeyran, Proximal alternating minimization and projection methods for nonconvex problems: an approach based on the Kurdyka-Lojasiewicz inequality, Math. Oper. Research, 2010 O. Axelsson, Iterative Solution Methods, Cambridge University Press, R. Baraniuk, V. Cevher, M. Duarte, C. Hegde, Model-based compressive sensing, IEEE Transactions on Information Theory, vol. 56, R. Baraniuk, M. Davenport, R. de Vore, M. Wakin, A Simple Proof of the Restricted Isometry Property for Random Matrices, Constructive Approximation, R. Baraniuk, V. Cevher, M. Wakin, Low-dimensional models for dimensionality reduction and signal recovery: A geometric perspective, Proceedings of the IEEE, vol. 98, J. Barzilai and J. Borwein, Two point step size gradient methods, IMA Journal of Numer. Anal., vol. 8, R. Basri, D. Jacobs, Lambertian Reflectance and Linear Subspaces, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 25, A. Beck, M. Teboulle, A fast iterative shrinkage-thresholding algorithm for linear inverse problems, SIAM Journal on Imaging Science, vol. 2, J. Bioucas-Dias, M. Figueiredo, A new TwIST: Two-step iterative shrinkage/thresholding algorithms for image restoration, IEEE Transactions on Image Processing, vol. 16, P. Boufounos, R. Baraniuk, 1-bit compressed sensing, Proceedings of the Conference on Information Science and Systems, Princeton, C. Boutsidis, M. Mahoney, P. Drineas, An improved approximation algorithm for the column subset selection problem, Proc. 20th Annual ACM/SIAM Symposium on Discrete Algorithms, New York, NY, 2008.

80 References P. Bühlmann, S. van der Geer, Statistics for High-Dimensional Data, Springer, E. Candès, The restricted isometry property and its implications for compressed sensing, Comptes Rendus Mathematique, vol. 346, E. Candès and B. Recht, Exact matrix completion via convex optimization, Foundations of Computational Mathematics, vol. 9, L. Carin, R. Baraniuk, V. Cevher, D. Dunson, M. Jordan, G. Sapiro, M. Wakin, Learning low-dimensional signal models, IEEE Signal Processing Magazine, vol. 28, V. Cevher, An ALPS view of sparse recovery, Proc. ICASSP, V. Chandrasekaran, B. Recht, P. Parrilo, A. Willsky, The convex geometry of linear inverse problems, submitted, R. Chartrand, W. Yin, Iteratively reweighted algorithms for compressive sensing, Proc. ICASSP, 2008 S. Chen, D. Donoho, M. Saunders, Atomic decomposition by Basis Pursuit, SIAM Review, vol. 43, P. Combettes, V. Wajs, Signal recovery by proximal forward-backward splitting, SIAM Journal Multiscale Modeling and Simulation, vol. 4, J. Eckstein, D. Bertsekas, On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators, Mathematical Programming, vol. 5, M. Figueiredo and J. Bioucas-Dias, Restoration of Poissonian images using alternating direction optimization, IEEE Transactions on Image Processing, vol. 19, M. Figueiredo, R. Nowak, S. Wright, Gradient projection for sparse reconstruction: application to compressed sensing and other inverse problems, IEEE Journal of Selected Topics in Signal Processing, vol. 1, 2007.

81 References D. Gabay, B. Mercier, A dual algorithm for the solution of nonlinear variational problems via finite-element approximations, Computers and Mathematics with Applications, vol. 2, R. Glowinski, A. Marroco, Sur l'approximation, par éléments finis d'ordre un, et la résolution, par pénalisation-dualité d'une classe de problèmes de Dirichlet non linéaires, Rev. Française d'Automatique, Y. Gordon, On Milman's inequality and random subspaces which escape through a mesh in R^n, in Geometric Aspects of Functional Analysis, Springer, E. Hale, W. Yin, Y. Zhang, Fixed-point continuation for l1-minimization: Methodology and convergence, SIAM Journal on Optimization, vol. 19, N. Halko, P.-G. Martinsson, J. Tropp, Finding structure with randomness: stochastic algorithms for constructing approximate matrix decompositions, SIAM Review, vol. 53, C. Hegde, M. Duarte, V. Cevher, Compressive sensing recovery of spike trains using a structured sparsity model, Proceedings of SPARS'09, Saint-Malo, France, M. Hestenes, Multiplier and gradient methods, Journal of Optimization Theory and Applications, vol. 4, A. Kyrillidis, V. Cevher, Recipes for hard thresholding methods, Tech. Rep., EPFL, J. Lee, V. Mirrokni, V. Nagarajan, M. Sviridenko, Non-monotone submodular maximization under matroid and knapsack constraints, Proc. 41st Annual ACM Symposium on Theory of Computing, Bethesda, MD, A. Lewis, J. Malick, Alternating projections on manifolds, Math. of Operations Research, vol. 33, D. Lorenz, Constructing test instances for basis pursuit denoising, submitted, N. Meinshausen, P. Bühlmann, High-dimensional graphs and variable selection with the lasso, The Annals of Statistics, vol. 34, pp , J.-J. Moreau, Proximité et dualité dans un espace hilbertien, Bull. Soc. Mathematiques de France, vol. 93, 1965.

82 References G. Nemhauser, L. Wolsey, Integer and combinatorial optimization, Wiley, S. Osher, M. Burger, D. Goldfarb, J. Xu, W. Yin, An iterative regularization method for total variation-based image restoration, SIAM Journal on Multiscale Modeling and Simulation, vol. 4, Y. Plan, Compressed sensing, sparse approximation, low-rank matrix estimation, PhD Thesis, Caltech, 2011 M. Powell, A method for nonlinear constraints in minimization problems, in Optimization, Academic Press, H. Raguet, J. Fadili, G. Peyré, Generalized Forward-Backward splitting, Tech. report, Hal , S. Setzer, G. Steidl, T. Teuber, Deblurring Poissonian images by split Bregman techniques, Journal of Visual Communication and Image Representation, K.-C. Toh, S. Yun, An accelerated proximal gradient algorithm for nuclear norm regularized least squares Problems, Pacific Journal of Optimization, vol. 6, A. Waters, A. Sankaranarayanan, R. Baraniuk, SpaRCS: recovering low-rank and sparse matrices from compressive measurements, Neural Information Processing Systems, S. Wright, R. Nowak, M. Figueiredo, Sparse reconstruction by separable approximation, IEEE Transactions on Signal Processing, vol. 57, W. Yin, S. Osher, D. Goldfarb, J. Darbon, Bregman iterative algorithms for l1-minimization with applications to compressed sensing, SIAM Journal on Imaging Science, vol. 1, T. Zhou, D. Tao, Godec: randomized low-rank & sparse matrix decomposition in noisy case, Proc. International Conference on Machine Learning, Bellevue, WA, 2011.
