Modern Signal Processing and Sparse Coding


Modern Signal Processing and Sparse Coding. School of Electrical and Computer Engineering, Georgia Institute of Technology. March 22, 2011

Raison d'être: modern signal processing. Signals without cosines? Sparse coding. Some applications: theoretical neuroscience, hyperspectral images

I require the use of a blade. Occam's Razor: a parsimonious shave every time! Given many hypotheses that explain the observations, pick the simplest one!

What's in a definition? What is signal processing? Frequency analysis? Filter theory? Applied linear algebra? "Signal processing... deals with operations on or analysis of signals... to perform useful operations on those signals." -Wikipedia (you know you'd look here first). Methods to extract useful information from measurements of the world around us.

Signals... What is this signal we're processing anyway? Well, what are you interested in? Images, video, music, radar, neurons, inertial measurement units... anything that can be represented by vectors!

...and Systems. And all this processing? Again: what interests you? Object classification, motion detection, direction of arrival, coding schemes, position determination (the list goes on...)

Two examples: Inverse problems

Classical Fourier. Need a principled way to describe signals. Classical decomposition: the frequency domain (Fourier).

Beyond the Cosine. Sine and cosine don't represent all signals well? Example: you need a LOT of cosines to represent one discontinuity.

Beyond the Cosine 2. An image example: [figure comparing the original image with FFT reconstructions from 10K and 25K coefficients and wavelet reconstructions from 5K, 10K, and 25K coefficients]
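
As an aside that is not from the slides, here is a minimal numpy sketch of the same idea in 1-D: keeping only the K largest FFT coefficients of a single step leaves noticeable error even for fairly large K, because a discontinuity spreads its energy across many frequencies. The signal length and values of K are illustrative.

```python
# Minimal sketch (not from the talk): a single discontinuity needs many
# Fourier coefficients. Keep only the K largest-magnitude FFT coefficients
# of a step signal and measure the reconstruction error.
import numpy as np

N = 1024
signal = np.zeros(N)
signal[N // 2:] = 1.0                           # one discontinuity

spectrum = np.fft.fft(signal)
for K in (8, 64, 256):
    keep = np.argsort(np.abs(spectrum))[-K:]    # indices of the K largest coefficients
    truncated = np.zeros_like(spectrum)
    truncated[keep] = spectrum[keep]
    recon = np.real(np.fft.ifft(truncated))
    err = np.linalg.norm(signal - recon) / np.linalg.norm(signal)
    print(f"K = {K:4d}   relative error = {err:.3f}")
```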

Beyond the Cosine 3. Use other shapes: other orthonormal bases (Gram-Schmidt your favorite functions), wavelets (localized shapes, still orthonormal), (tight) frames - more kernels than dimensions!
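
Purely as an illustration (not material from the slides), "Gram-Schmidt your favorite functions" can look like this in numpy: sample a handful of functions on a grid and orthonormalize them into a small basis. The choice of monomials and grid size is arbitrary.

```python
# Minimal sketch of "Gram-Schmidt your favorite functions": sample a few
# monomials on a grid and orthonormalize them into a small basis.
import numpy as np

t = np.linspace(-1.0, 1.0, 200)
favorites = [t**k for k in range(5)]        # your favorite functions, sampled

basis = []
for f in favorites:
    v = f.copy()
    for q in basis:                         # subtract components along earlier vectors
        v -= (q @ v) * q
    basis.append(v / np.linalg.norm(v))

B = np.stack(basis)                         # rows are orthonormal: B @ B.T ≈ I
print(np.round(B @ B.T, 3))
```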

Sparse Coding. [Pictorial equation: sparse coding = ...]

Beliefs and Morals. Remember Occam's razor? Let's say simple is low dimensional. We may have a large vector, but in some linear transformation, many of its elements are zero!!

Formalities. Signal: y ∈ R^M, which we can write as y = Φx + ε with x ∈ R^N (typically M < N). Φ is overcomplete, so there is freedom in the choice of x: choose a sparse x.

Less Formal. Can visualize as: [matrix-vector picture of y = Φx]

Finding Coefficients. "Your optimal inner product depends on your own system of beliefs and morals." - Dr. J. Romberg
Linear (least-squares) version, the pseudo-inverse (or BLUE): x̂ = arg min_x [ ‖y - Φx‖₂² + λ‖x‖₂² ]
Optimize l0-regularized least squares (force sparsity): x̂ = arg min_x [ ‖y - Φx‖₂² + λ‖x‖₀ ]
Wonderful result: we can optimize l1-regularized least squares instead: x̂ = arg min_x [ ‖y - Φx‖₂² + λ‖x‖₁ ]
Now it's convex! (fast solvers)
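
The slides do not name a particular solver; as one hedged illustration, a standard choice for the l1-regularized problem is ISTA (iterative soft-thresholding). The step size, λ, and problem sizes below are illustrative, not values from the talk.

```python
# Minimal ISTA sketch (one common solver; the talk does not prescribe one) for
#   x_hat = argmin_x  ||y - Phi @ x||_2^2 + lam * ||x||_1
import numpy as np

def ista(Phi, y, lam, n_iter=500):
    L = 2.0 * np.linalg.norm(Phi, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * Phi.T @ (Phi @ x - y)           # gradient of the quadratic term
        z = x - grad / L                             # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return x

# Illustrative use with an overcomplete Phi (M < N) and a sparse ground truth
rng = np.random.default_rng(0)
M, N = 64, 256
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
x_true = np.zeros(N)
x_true[rng.choice(N, 8, replace=False)] = rng.standard_normal(8)
y = Phi @ x_true + 0.01 * rng.standard_normal(M)
x_hat = ista(Phi, y, lam=0.05)
```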

Aside: Ties to Compressive Sensing. Here y are the measurements and Φ is a random sensing matrix. If x is s-sparse (or s-sparse in some known basis), we can beat the Shannon-Nyquist sampling theorem!
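
A toy numpy/scipy illustration of this claim (mine, not the speaker's): draw M < N random measurements of an s-sparse vector and recover it by basis pursuit, min ‖x‖₁ subject to Φx = y, written as a linear program. The sizes, sparsity level, and solver are illustrative choices.

```python
# Toy compressive-sensing sketch (illustrative sizes, not from the talk):
# recover an s-sparse x in R^N from M < N random measurements by basis pursuit,
#   min ||x||_1  subject to  Phi @ x = y,
# cast as a linear program with x = u - v,  u, v >= 0.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
N, M, s = 128, 40, 5
Phi = rng.standard_normal((M, N)) / np.sqrt(M)        # random sensing matrix
x_true = np.zeros(N)
x_true[rng.choice(N, s, replace=False)] = rng.standard_normal(s)
y = Phi @ x_true                                      # noiseless measurements

c = np.ones(2 * N)                                    # sum(u) + sum(v) = ||x||_1
A_eq = np.hstack([Phi, -Phi])                         # Phi @ (u - v) = y
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
x_hat = res.x[:N] - res.x[N:]
print("recovery error:", np.linalg.norm(x_hat - x_true))
```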

There's an App for that! What we do in the Neurolab: brains, analog systems, audio, dynamics, manifold embedding.

Brains!! Computational neuroscience: how can we use mathematical principles (mathematics, engineering, physics) to understand neural systems? The brain takes in sensory information and extracts relevant information. Big issue: sensory coding. We look at the visual cortex.

Visual Pathway. Has to interpret natural scenes. Many layers (LGN, V1, V2, V3, V4, ...) (Hubel 1988)

Receptive Fields. Encodes via projections onto receptive fields: a VERY overcomplete set.

Non-Linear Effects. The receptive-field picture suggests an inherently linear coding: that should be easy enough to test.

Surround Suppression Changing field of view changes activity?! (Vinje and Gallant 2002)

Surround Suppression. [Figure: response vs. radius in pixels] (Mengchen and Rozell 2011). Changing the field of view changes contrast tuning?! (Alitto and Dan 2010)

Efficient Coding. Conclusion: linear projections are WRONG. Natural images have special structure (remember the puppy). Efficient coding hypothesis: the visual cortex takes advantage of this structure (perhaps using the overcomplete nature of the receptive fields) (Barlow 1961). Can sparse coding explain this?

Optimize for Natural Images. What Φ is best for natural images? Take example data and optimize: X̂, Φ̂ = arg min_{X,Φ} [ ‖Y - ΦX‖_F² + λ Σ_k ‖x_k‖₁ ]
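
Olshausen and Field used their own learning rule; purely as an illustration of the objective above, a generic alternating scheme (sparse-code X with Φ fixed, then take a gradient step on Φ and renormalize its columns) might look like the sketch below. All step sizes and iteration counts are illustrative.

```python
# Generic alternating-minimization sketch (not Olshausen & Field's exact rule) for
#   min_{X, Phi}  ||Y - Phi @ X||_F^2 + lam * sum_k ||x_k||_1
import numpy as np

def soft(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def learn_dictionary(Y, n_atoms, lam=0.1, n_outer=50, n_inner=30, step=0.1):
    rng = np.random.default_rng(0)
    Phi = rng.standard_normal((Y.shape[0], n_atoms))
    Phi /= np.linalg.norm(Phi, axis=0)                 # unit-norm dictionary atoms
    X = np.zeros((n_atoms, Y.shape[1]))
    for _ in range(n_outer):
        # sparse-coding step: iterative soft-thresholding on X with Phi held fixed
        L = 2.0 * np.linalg.norm(Phi, 2) ** 2
        for _ in range(n_inner):
            X = soft(X - 2.0 * Phi.T @ (Phi @ X - Y) / L, lam / L)
        # dictionary step: gradient step on Phi, then renormalize the atoms
        Phi -= step * 2.0 * (Phi @ X - Y) @ X.T / Y.shape[1]
        Phi /= np.linalg.norm(Phi, axis=0) + 1e-12
    return Phi, X

# e.g. Phi, X = learn_dictionary(patch_matrix, n_atoms=256) for whitened image patches
```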

Best Kernels? Receptive Fields! Olshausen and Field ran this optimization and found: [figure of learned basis functions resembling V1 receptive fields] (Olshausen and Field 1996)

Analog Systems. Need a biologically plausible inference scheme: the Locally Competitive Algorithm. Finds x in an analog system (Rozell et al. 2008).
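
For intuition only, here is a discrete-time (Euler-integrated) sketch of the LCA dynamics from Rozell et al. 2008; τ, the step size, and the soft-threshold activation are illustrative choices rather than values from the talk.

```python
# Euler-integrated sketch of the Locally Competitive Algorithm (Rozell et al. 2008):
# internal states u drive thresholded outputs a, with lateral inhibition so that
# the steady-state a solves the l1-regularized least-squares problem.
import numpy as np

def lca(Phi, y, lam=0.1, tau=0.01, dt=0.001, n_steps=2000):
    n = Phi.shape[1]
    G = Phi.T @ Phi - np.eye(n)                      # lateral inhibition between units
    b = Phi.T @ y                                    # feedforward drive
    u = np.zeros(n)
    threshold = lambda v: np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)
    for _ in range(n_steps):
        a = threshold(u)                             # soft-threshold activation
        u += (dt / tau) * (b - u - G @ a)            # tau * du/dt = b - u - G a
    return threshold(u)
```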

Analog Systems II. Neural implementation: faster than digital, and maybe low power.

HSI Images Hyperspectral Imagery

HSI Images. HSI captures spectrally detailed ground images.

Unmixing HSI. Solve: what materials were mixed to create the measurement y = Φx + ε? Unmixing: optimize the coefficients. Try sparse priors: x̂ = arg min_x [ ‖y - Φx‖₂² + λ‖x‖₁ ]

Learning Φ. Need Φ for the optimization. Can use spectra measured in a lab, OR can learn it from the data! X̂, Φ̂ = arg min_{X,Φ} [ ‖Y - ΦX‖_F² + λ Σ_k ‖x_k‖₁ ]

Learned Spectra. And: we recover materials! [Figure: reflectance vs. wavelength (μm) for recovered Northshore materials: water, sand, alterniflora, submerged net, pine, phragmites]

Super-Resolution. Recover the material coefficients with â = arg min_a [ ‖y - BΦa‖₂² + λ‖a‖₁ ], then recover the spectrum with x̂ = Φâ.
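
A hedged toy sketch of this two-step idea: the band-averaging operator B, the spectral library Φ, and λ below are stand-ins (the talk does not spell out their forms), and the sparse step reuses the same soft-thresholding iteration sketched earlier.

```python
# Toy super-resolution sketch with stand-in B, Phi and lam (illustrative only):
# infer sparse material coefficients through the coarse operator B @ Phi, then
# reconstruct the fine-resolution spectrum as Phi @ a_hat.
import numpy as np

n_fine, n_coarse, n_materials = 200, 40, 12
rng = np.random.default_rng(2)
Phi = np.abs(rng.standard_normal((n_fine, n_materials)))   # stand-in spectral library

block = n_fine // n_coarse                                  # B averages blocks of fine bands
B = np.kron(np.eye(n_coarse), np.ones((1, block)) / block)

a_true = np.zeros(n_materials)
a_true[[1, 7]] = [0.6, 0.4]                                 # two materials in the pixel
y = B @ (Phi @ a_true)                                      # coarse measured spectrum

# a_hat = argmin_a ||y - B Phi a||_2^2 + lam ||a||_1 via iterative soft-thresholding
A = B @ Phi
L = 2.0 * np.linalg.norm(A, 2) ** 2
lam = 0.01
a_hat = np.zeros(n_materials)
for _ in range(1000):
    z = a_hat - 2.0 * A.T @ (A @ a_hat - y) / L
    a_hat = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

x_hat = Phi @ a_hat                                         # super-resolved spectrum
```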

Super-Resolution HSI. [Figure: coarse HSI spectrum (summed reflectance vs. wavelength in μm) and inferred coefficient magnitudes vs. coefficient number]

Classification Results. Benefits to multi-class material classification: 50-60% faster classification time for the same error rate; can classify better with smaller training sets. Sparse coding is a more general representation!

Closing Remarks. We can analyze data in many different ways. Simplicity is not a bad thing! Consider what you want to do, then choose the best methods. Signal processing is not solved; there's a lot of research to do (*nudge* *nudge*).

Acknowledgments. Thanks to: Dr. Christopher Rozell, Dr. Bruno Olshausen, Dr. Justin Romberg, Han-Lun Yap, Abbie Kressner, Mengchen Zhu.

Questions/Comments?