Lecture 17 Sparse Convex Optimization


Compressed sensing

A short introduction to Compressed Sensing: an imaging perspective. A scene is captured as a 10-megapixel picture and then compressed. Why do we compress images?

Introduction to Compressed Sensing. Images are compressible because only certain parts of the information are important (e.g. objects and their edges) and some information is unwanted (e.g. noise). Image compression: take an input image u, pick a good dictionary Φ, find a sparse representation x of u such that ||Φx - u||_2 is small, and save x. This is traditional compression.

Introduction to Compressed Sensing: an imaging perspective. Traditional compression takes in n = 10 megapixels of data and stores only k = 100 kilobytes.

Introduction to Compressed Sensing. If only 100 kilobytes are saved, why do we need a 10-megapixel camera in the first place? Answer: a traditional compression algorithm needs the complete image to compute the dictionary and the sparse representation x. Can we do better than this?

Introduction to Compressed Sensing. Let k = ||x||_0 and n = dim(x) = dim(u). In compressed sensing based on l1 minimization, the number of measurements required is m = O(k log(n/k)) (Donoho, Candès-Tao).

Introduction to Compressed Sensing. The pipeline: input, linear encoding (signal acquisition), signal reconstruction, signal representation.


Introduction to Compressed Sensing. Input: b = Bu = BΦx, with A = BΦ (B the measurement matrix, Φ the dictionary). Output: x. In compressed sensing, m = dim(b) << dim(u) = dim(x) = n, so Ax = b is an underdetermined system. Approaches for recovering x (hence the image u): solve min ||x||_0 subject to Ax = b; solve min ||x||_1 subject to Ax = b; other approaches.
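The "other approaches" include greedy methods. As an illustration (not from the lecture), here is a minimal sketch of Orthogonal Matching Pursuit (OMP) in NumPy; the problem sizes and support are made up for the demo.

```python
import numpy as np

def omp(A, b, k, tol=1e-10):
    """Orthogonal Matching Pursuit: greedily build the support of x."""
    n = A.shape[1]
    r = b.astype(float).copy()      # current residual
    support = []
    x = np.zeros(n)
    coef = np.zeros(0)
    for _ in range(2 * k):          # a few extra iterations for safety
        if np.linalg.norm(r) < tol:
            break
        # pick the column most correlated with the residual
        j = int(np.argmax(np.abs(A.T @ r)))
        if j not in support:
            support.append(j)
        # least-squares refit on the current support
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        r = b - A[:, support] @ coef
    x[support] = coef
    return x

# demo: recover a 3-sparse x from m = 20 random measurements, n = 50
rng = np.random.default_rng(0)
n, m, k = 50, 20, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[5, 17, 40]] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = omp(A, b, k)
print("max error:", np.max(np.abs(x_hat - x_true)))
```

For a Gaussian A with these dimensions, exact recovery of a 3-sparse signal succeeds with high probability.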

Difficulties: large scales; completely dense data A. However: the solutions x are expected to be sparse, and the matrices A are often fast transforms.

Sparse signal reconstruction: recovery using the l1-norm. Given a sparse signal x and a matrix A with Ax = b, the system is underdetermined; but if card(x) < m, the signal can be recovered by solving min ||x||_0 subject to Ax = b. This problem is NP-hard in general. The typical relaxation replaces the l0 "norm" by the l1 norm: min ||x||_1 subject to Ax = b.

Signal recovery. Candès & Tao and Donoho showed that, under certain conditions on the matrix A, the sparse signal is recovered exactly by solving the convex relaxation. The relevant matrix property is called the restricted isometry property (RIP).

Restricted Isometry Property. A vector is said to be s-sparse if it has at most s nonzero entries. For a given s, the isometry constant delta_s of a matrix A is the smallest constant such that (1 - delta_s) ||x||_2^2 <= ||Ax||_2^2 <= (1 + delta_s) ||x||_2^2 for any s-sparse vector x.
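For small matrices the constant delta_s can be computed exactly by enumerating all s-column submatrices (the cost is combinatorial, which is one reason the RIP is hard to certify for large A); a sketch, assuming NumPy:

```python
import numpy as np
from itertools import combinations

def rip_constant(A, s):
    """Exact delta_s of A by enumeration:
    delta_s = max over |S| = s of max(sigma_max^2 - 1, 1 - sigma_min^2),
    where sigma are the singular values of the submatrix A[:, S]."""
    n = A.shape[1]
    delta = 0.0
    for S in combinations(range(n), s):
        sv = np.linalg.svd(A[:, list(S)], compute_uv=False)
        delta = max(delta, sv[0] ** 2 - 1.0, 1.0 - sv[-1] ** 2)
    return delta

rng = np.random.default_rng(1)
m, n = 15, 20
A = rng.standard_normal((m, n)) / np.sqrt(m)   # scaled so E||Ax||^2 = ||x||^2
print([round(rip_constant(A, s), 3) for s in (1, 2, 3)])
```

The printed constants are nondecreasing in s, since every s-sparse vector is also (s+1)-sparse.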

Why the l1 norm? [Figure: the affine set Ax = b shown against the l0, l1, and l2 balls. The smallest l1 ball touching the set Ax = b typically meets it at a corner, i.e. at a sparse point, mimicking the l0 ball; the round l2 ball touches it at a generally dense point.]
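The geometric claim can be checked numerically. A minimal sketch (assuming SciPy, with made-up sizes): compare the minimum-l2-norm solution of an underdetermined system, given by the pseudoinverse, with the minimum-l1-norm solution, computed as a linear program via the split x = x+ - x-.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
m, n = 15, 30
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[[3, 12, 25]] = [1.0, -2.0, 0.5]
b = A @ x_true

# minimum l2-norm solution x = A^+ b: dense in general
x_l2 = np.linalg.pinv(A) @ b

# minimum l1-norm solution: with x = xp - xm, xp, xm >= 0, solve the LP
#   min 1^T (xp + xm)  s.t.  [A, -A] [xp; xm] = b
res = linprog(np.ones(2 * n), A_eq=np.hstack([A, -A]), b_eq=b,
              bounds=[(0, None)] * (2 * n))
x_l1 = res.x[:n] - res.x[n:]

print("nonzeros:  l2 =", int(np.sum(np.abs(x_l2) > 1e-6)),
      " l1 =", int(np.sum(np.abs(x_l1) > 1e-6)))
```

Both solutions satisfy Ax = b exactly, but the l1 solution concentrates on a few coordinates while the l2 solution spreads over all of them.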

Sparse regularized regression

Least Squares Linear Regression

Standard form of the LS problem. Least squares problem: min_x ||Ax - b||_2^2. This includes the solution of a system of linear equations Ax = b, and may be used with additional linear constraints. Ridge regression adds a quadratic penalty, min_x ||Ax - b||_2^2 + lambda ||x||_2^2, where lambda > 0 is the regularization parameter, i.e. the trade-off weight.
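The ridge problem has the closed-form solution x = (A^T A + lambda I)^{-1} A^T b. A quick numerical check (sizes made up), assuming NumPy:

```python
import numpy as np

def ridge(A, b, lam):
    """Closed-form solution of min ||Ax - b||^2 + lam * ||x||^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

rng = np.random.default_rng(3)
m, n = 50, 10
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

x_ls = np.linalg.lstsq(A, b, rcond=None)[0]
# lam -> 0 recovers plain least squares; larger lam shrinks the solution
print(np.linalg.norm(ridge(A, b, 1e-10) - x_ls),
      np.linalg.norm(ridge(A, b, 10.0)), np.linalg.norm(x_ls))
```

Note the trade-off: larger lambda reduces the norm of the solution at the cost of a larger residual.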

Robust least squares regression. The robust formulation is less straightforward than for the SVM, but it is possible to show that the robust problem leads to a regularized least-squares problem. Another interpretation of the resulting regularization is feature selection.

Lasso and other formulations. Sparse regularized regression, or Lasso: min ||Ax - b||_2^2 + lambda ||x||_1. Sparse regressor selection: min ||Ax - b||_2 subject to ||x||_1 <= tau. Noisy signal recovery: min ||x||_1 subject to ||Ax - b||_2 <= sigma.
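The penalized (Lasso) formulation can be solved by proximal gradient descent, also known as ISTA. A minimal sketch (not the lecture's algorithm; sizes and parameters made up), assuming NumPy:

```python
import numpy as np

def soft_threshold(z, t):
    """Prox operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, b, lam, n_iter=1000):
    """Proximal gradient (ISTA) for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of A^T(Ax - b)
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - A.T @ (A @ x - b) / L, lam / L)
    return x

rng = np.random.default_rng(4)
m, n = 40, 80
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[7, 30, 55]] = [3.0, -2.0, 1.5]
b = A @ x_true + 0.01 * rng.standard_normal(m)
x_hat = ista(A, b, lam=0.05)
print("estimated support:", np.flatnonzero(np.abs(x_hat) > 1e-3))
```

Each iteration is one gradient step on the smooth least-squares term followed by soft-thresholding, which is exactly the prox step for the l1 penalty.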

Connection between the different formulations: if a solution exists, then for appropriately matched values of their regularization parameters the formulations above yield the same solution.

Types of convex problems. Variable substitution: writing x = x+ - x- with x+, x- >= 0 turns min ||x||_1 subject to Ax = b into a linear programming problem, min 1^T(x+ + x-) subject to A(x+ - x-) = b, x+, x- >= 0.

Types of convex problems. The same variable substitution handles the convex non-smooth objective with linear inequality constraints.

Types of convex problems. Convex QP with linear inequality constraints; the norm-constrained variants are SOCPs.

Optimization approaches

Lasso. Regularized regression, or Lasso: min ||Ax - b||_2^2 + lambda ||x||_1; after the split x = x+ - x-, this is a convex QP with nonnegativity constraints.
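A sketch of that reformulation (assuming SciPy; sizes made up): the split x = x+ - x- turns the non-smooth l1 term into a linear one, leaving a smooth bound-constrained problem that a solver such as L-BFGS-B handles directly.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
m, n = 30, 60
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[4, 20]] = [2.0, -1.0]
b = A @ x_true
lam = 0.1

def f_and_grad(z):
    """Objective 0.5*||A(xp - xm) - b||^2 + lam*1^T(xp + xm), with gradient."""
    xp, xm = z[:n], z[n:]
    r = A @ (xp - xm) - b
    g = A.T @ r
    return 0.5 * r @ r + lam * z.sum(), np.concatenate([g + lam, -g + lam])

res = minimize(f_and_grad, np.zeros(2 * n), jac=True, method="L-BFGS-B",
               bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]
print("nonzeros:", int(np.sum(np.abs(x_hat) > 1e-4)))
```

At the optimum at most one of x+_j, x-_j is nonzero, so 1^T(x+ + x-) equals ||x||_1 and the split introduces no slack.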

Standard QP formulation. Reformulate the Lasso as a standard QP. How is it different from the SVM's dual QP?


Standard QP formulation. Features of this QP: 1. Q = M^T M, where M is m x n with n >> m. 2. Forming Q is O(m^2 n); factorizing Q + D is O(m^3). 3. There are no upper-bound constraints. IPM complexity is O(m^3) per iteration.

Dual Problem


Primal-dual pair of problems for the Lasso.
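As a reference point (a standard derivation, possibly in different notation than the lecture), the dual of the Lasso min_x (1/2)||Ax - b||_2^2 + lambda ||x||_1 can be obtained by introducing r = Ax - b and dualizing the constraint:

```latex
% L(x, r, \nu) = \tfrac{1}{2}\|r\|_2^2 + \lambda\|x\|_1 + \nu^T (Ax - b - r)
% Minimizing over r gives r = \nu; minimizing over x is finite iff
% \|A^T \nu\|_\infty \le \lambda.  Hence the dual problem is
\begin{aligned}
\max_{\nu}\quad & -\tfrac{1}{2}\|\nu\|_2^2 - b^T \nu \\
\text{s.t.}\quad & \|A^T \nu\|_\infty \le \lambda .
\end{aligned}
```

The box-like constraint ||A^T nu||_inf <= lambda is what makes the dual attractive for screening and homotopy methods.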

Optimality Conditions
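For the Lasso, the (subgradient) optimality conditions say that c = A^T(b - Ax) must satisfy |c_j| <= lambda for all j, with c_j = lambda * sign(x_j) wherever x_j is nonzero. A numerical check (assuming NumPy; data made up), using simple coordinate descent with unit-norm columns:

```python
import numpy as np

rng = np.random.default_rng(6)
m, n = 25, 40
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, axis=0)                 # unit-norm columns
b = rng.standard_normal(m)
lam = 0.3

# coordinate descent for min 0.5*||Ax - b||^2 + lam*||x||_1:
# with unit-norm columns the coordinate update is a soft threshold
x = np.zeros(n)
for _ in range(300):
    for j in range(n):
        rho = A[:, j] @ (b - A @ x + A[:, j] * x[j])
        x[j] = np.sign(rho) * max(abs(rho) - lam, 0.0)

c = A.T @ (b - A @ x)                          # correlations with the residual
on = np.flatnonzero(x)
print("max |c_j|:", np.max(np.abs(c)),
      " on-support condition holds:",
      bool(np.all(np.abs(c[on] - lam * np.sign(x[on])) < 1e-4)))
```

At the solution, every off-support correlation sits strictly inside the interval [-lambda, lambda], while on-support correlations sit exactly on its boundary.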