Graph Connectivity in Sparse Subspace Clustering


Graph Connectivity in Sparse Subspace Clustering. Behrooz Nasihatkon, NICTA (www.nicta.com.au), 24 March 2011.

Outline 1 Subspace Clustering 2 Sparse Subspace Clustering 3 Graph Connectivity 4 Conclusion 2/35

Outline 1 Subspace Clustering Example Subspace Clustering Applications Solutions 2 Sparse Subspace Clustering 3 Graph Connectivity 4 Conclusion 3/35

Example: Rigid Body
Frame 1: each image point satisfies x_i = A_1 [X_i; 1] with A_1 ∈ R^{2×4}, so stacking all n points,

$$[x_1 \cdots x_n] = A_1 \begin{bmatrix} X_1 & \cdots & X_n \\ 1 & \cdots & 1 \end{bmatrix}_{4 \times n}.$$

Frame 2: y_i = A_2 [X_i; 1], and stacking all F frames,

$$\begin{bmatrix} x_1 & \cdots & x_n \\ y_1 & \cdots & y_n \\ \vdots & & \vdots \end{bmatrix} = \begin{bmatrix} A_1 \\ A_2 \\ \vdots \\ A_F \end{bmatrix}_{2F \times 4} \begin{bmatrix} X_1 & \cdots & X_n \\ 1 & \cdots & 1 \end{bmatrix}_{4 \times n}.$$
4/35

Example: Rigid Body
The full 2F × n measurement matrix therefore factors as

$$\begin{bmatrix} x_{11} & \cdots & x_{1n} \\ \vdots & & \vdots \\ x_{F1} & \cdots & x_{Fn} \end{bmatrix}_{2F \times n} = \begin{bmatrix} A_1 \\ \vdots \\ A_F \end{bmatrix}_{2F \times 4} \begin{bmatrix} X_1 & \cdots & X_n \\ 1 & \cdots & 1 \end{bmatrix}_{4 \times n},$$

so the trajectories of a single rigid body lie in a subspace of dimension at most 4. Given only the left-hand side, the factorization is unknown.
5/35

Subspace Clustering
Data lying on a mixture of subspaces. The goal is to recover the number of subspaces, their dimensions, a basis for each subspace, and the segmentation of the data.
6/35
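A mixture of subspaces in this sense is easy to simulate. The sketch below is not from the talk; the ambient dimension, subspace dimensions, and sample counts are my own illustrative choices.

```python
# Toy "mixture of subspaces" data: points drawn from a union of low-dimensional
# linear subspaces of R^D, together with the ground-truth segmentation.
import numpy as np

def union_of_subspaces(D=10, dims=(2, 3, 2), points_per_subspace=50, seed=0):
    rng = np.random.default_rng(seed)
    X, labels = [], []
    for k, d in enumerate(dims):
        B, _ = np.linalg.qr(rng.standard_normal((D, d)))  # orthonormal basis of a d-dim subspace
        X.append(B @ rng.standard_normal((d, points_per_subspace)))
        labels += [k] * points_per_subspace
    return np.hstack(X), np.array(labels)                 # D x N data matrix, labels of length N
```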

Applications
Motion Segmentation [Rene Vidal et al. 2008]
Video Shot Segmentation [Le Lu and R. Vidal 2006]
Illumination Invariant Clustering [J. Ho et al. 2003]
Image Segmentation [Allen Yang et al. 2008]
Image Representation and Compression [Wei Hong et al. 2005]
Linear Hybrid Systems Identification [Rene Vidal et al. 2003]
7/35

Solutions
Random Sample Consensus (RANSAC) [Martin Fischler and R. Bolles 1981]
Mixture of Probabilistic PCA [Michael Tipping and C. Bishop 1999]
Generalized PCA (GPCA) [Rene Vidal et al. 2005]
Locally Linear Manifold Clustering (LLMC) [Alvina Goh and R. Vidal 2007]
Agglomerative Lossy Compression (ALC) [Yi Ma et al. 2007]
Sparse Subspace Clustering (SSC) [Ehsan Elhamifar and R. Vidal 2009]
Low-Rank Representation (LRR) [Guangcan Liu et al. 2010]
8/35

Outline 1 Subspace Clustering 2 Sparse Subspace Clustering Sparse Representation Main Theorem Noise and Outliers Open Problems 3 Graph Connectivity 4 Conclusion 9/35

Sparse Representation
Represent y as a linear combination of the smallest possible subset of the vectors {x_1, x_2, ..., x_n}:

min ||a||_0   s.t.   y = Xa,   where X = [x_1 x_2 ... x_n].

This problem is NP-hard, so use L_1-minimization instead:

min ||a||_1   s.t.   y = Xa.

Under suitable conditions the relaxation is exact (L_1/L_0 equivalence).
10/35
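The L_0-to-L_1 relaxation is straightforward to prototype. The sketch below is not from the talk; the use of cvxpy and the problem sizes are my own assumptions.

```python
# Minimal L1-minimization sketch: recover a sparse coefficient vector a with y = X a.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 50))               # dictionary, columns x_1, ..., x_n
a_true = np.zeros(50)
a_true[[3, 17, 41]] = [1.0, -2.0, 0.5]          # y is a combination of only 3 columns
y = X @ a_true

a = cp.Variable(50)
cp.Problem(cp.Minimize(cp.norm1(a)), [X @ a == y]).solve()
print(np.flatnonzero(np.abs(a.value) > 1e-6))   # ideally the support {3, 17, 41}
```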

Sparse Subspace Clustering
If we had the basis [b_1 b_2 ... b_m], we could represent each x_i as a sparse combination of {b_1, ..., b_m}. We do not have the basis, so instead represent each x_i as a sparse combination of the remaining data points, X \ {x_i}.
11/35

Main Theorem
For each x_i solve

a_i = arg min ||a||_1   s.t.   x_i = X a,  a_i = 0,

where the constraint a_i = 0 excludes x_i from its own representation. Construct a graph whose nodes are the points x_i, with x_i connected to x_j whenever the j-th element of a_i is nonzero.

Theorem. If the subspaces are independent, the subgraphs of different subspaces are disconnected from one another.

The segmentation can then be read off the connected components; in practice, spectral clustering is applied using [a_1 a_2 ... a_n].
12/35
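In code, this construction is one L_1 program per point (with its own coefficient forced to zero), a symmetrized affinity matrix, and spectral clustering. The sketch below is a minimal reading of the slide, not the authors' released implementation; the choice of cvxpy, scikit-learn and the threshold are assumptions.

```python
# Sketch of the SSC pipeline: per-point sparse representation, graph construction,
# spectral clustering on the resulting affinity matrix.
import numpy as np
import cvxpy as cp
from sklearn.cluster import SpectralClustering

def ssc(X, n_clusters, tol=1e-6):
    """X: D x N data matrix whose columns are the points x_i."""
    D, N = X.shape
    C = np.zeros((N, N))
    for i in range(N):
        a = cp.Variable(N)
        cons = [X @ a == X[:, i], a[i] == 0]        # a_i = 0: x_i cannot represent itself
        cp.Problem(cp.Minimize(cp.norm1(a)), cons).solve()
        C[:, i] = a.value
    W = np.abs(C) + np.abs(C).T                     # symmetric affinity built from [a_1 ... a_n]
    W[W < tol] = 0.0                                # keep an edge only for nonzero coefficients
    return SpectralClustering(n_clusters=n_clusters,
                              affinity='precomputed').fit_predict(W)
```

With data from the union_of_subspaces sketch above and independent subspaces, the returned labels should agree with the ground-truth segmentation up to a permutation of cluster indices.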

Example. Figure: An example of an SSC graph. 13/35

Noise and Outliers
Noise:    min ||a||_1 + λ||e||_2   s.t.   x_i = X a + e,  a_i = 0
Outliers: min ||a||_1 + λ||e||_1   s.t.   x_i = X a + e,  a_i = 0
14/35
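Only the objective and the equality constraint change relative to the noiseless program. A hedged sketch of the per-point noisy step follows; λ is an arbitrary tuning parameter of mine, and switching the penalty on e from L_2 to L_1 gives the outlier model.

```python
# Noisy SSC step for one point x_i: sparse coefficients plus a small dense error e.
import cvxpy as cp

def noisy_ssc_step(X, i, lam=10.0):
    D, N = X.shape
    a, e = cp.Variable(N), cp.Variable(D)
    objective = cp.Minimize(cp.norm1(a) + lam * cp.norm(e, 2))   # outliers: cp.norm1(e)
    constraints = [X @ a + e == X[:, i], a[i] == 0]
    cp.Problem(objective, constraints).solve()
    return a.value, e.value
```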

Open Problems
Noise and outliers. Extension to manifolds. Graph connectivity within each subspace.
15/35

Outline 1 Subspace Clustering 2 Sparse Subspace Clustering 3 Graph Connectivity Example Preparation 2D Case 3D Case N-D Case 4 Conclusion 16/35

A Simple Example (figures)
17/35

Preparation: Adding Negative Points
x_i = Σ_{j≠i} a_j x_j. Since a_j x_j = (−a_j)(−x_j), augmenting the data with the negated points, X^± = X ∪ (−X), lets us assume every coefficient satisfies a_j ≥ 0.
18/35

Preparation
a_i = arg min ||a||_1   s.t.   x_i = X a,  a_i = 0
becomes, over the augmented dictionary,
a_i = arg min 1^T a   s.t.   x_i = X^± a,  a ≥ 0,  a_i = 0.
19/35
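The equivalence of the two programs is the usual variable-splitting argument; a short derivation in my own words, consistent with the X^± construction above:

```latex
% Split a into nonnegative parts, a = a^+ - a^-, with a^+, a^- \ge 0:
\|a\|_1 = \mathbf{1}^T (a^+ + a^-),
\qquad
X a = \begin{bmatrix} X & -X \end{bmatrix} \begin{bmatrix} a^+ \\ a^- \end{bmatrix}
    = X^{\pm} \tilde a , \qquad \tilde a \ge 0 .
% Hence
\min_a \; \|a\|_1 \ \text{s.t.}\ x_i = X a,\ a_i = 0
\quad = \quad
\min_{\tilde a \ge 0} \; \mathbf{1}^T \tilde a \ \text{s.t.}\ x_i = X^{\pm} \tilde a,
\ \text{(the entries of } \tilde a \text{ corresponding to } \pm x_i \text{ fixed to } 0\text{)},
% since at an optimum a^+_j a^-_j = 0 for every j; otherwise the objective could be decreased.
```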

Geometry of Polytopes
L_1 minimization and the geometry of polytopes [David Donoho 2005].
For a feasible b ≥ 0, x_i = X_i b = X_i (b / ||b||_1) · ||b||_1, so write x_i = X_i p · α with ||p||_1 = 1, p ≥ 0 and α to be minimized. The set {X_i p : ||p||_1 = 1, p ≥ 0} is the convex hull of the columns of X_i.
20/35

Geometry of Polytopes
Equivalently (with β = 1/α): maximize β s.t. β x_i ∈ hull(X_i).
21/35
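Finding the largest such β is a small linear program. The sketch below is my own formulation (the variable ordering and the use of scipy.optimize.linprog are assumptions); the optimal β equals 1/||a||_1 of the L_1 solution, which is the equivalence the slide refers to.

```python
# LP for the slide's problem: maximize beta subject to beta * x lying in the convex hull
# of the columns of X (e.g. X = X_i, the dictionary for point x_i).
import numpy as np
from scipy.optimize import linprog

def max_beta_in_hull(X, x):
    d, n = X.shape
    c = np.zeros(n + 1)
    c[-1] = -1.0                              # maximize beta == minimize -beta
    A_eq = np.zeros((d + 1, n + 1))
    A_eq[:d, :n] = X                          # X p - beta * x = 0
    A_eq[:d, -1] = -x
    A_eq[d, :n] = 1.0                         # sum(p) = 1, p >= 0: p parametrizes hull(X)
    b_eq = np.zeros(d + 1)
    b_eq[d] = 1.0
    bounds = [(0, None)] * n + [(0, None)]    # p >= 0, beta >= 0
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[-1] if res.success else None
```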

Assumptions
Indivisible subspaces? Degenerate cases must be excluded.
Assumption: No d points lie in a (d−1)-dimensional subspace.
Assumption: The facet of each polytope(X_i) on which y_i lies has exactly d points on it.
22/35

Neighbourhood Cones
Theorem. Two points are neighbours iff their neighbourhood cones strictly intersect.
23/35

Projection on Unit Hypersphere 24/35

2D Case (figures)
25/35

3D Case (figures)
The residual holes for one connected component are topologically open disks, each with area less than a half-sphere (Gauss-Bonnet Theorem).
26/35

What about d ≥ 4? No residual holes. Search for counterexamples. 27/35

Towards a Counterexample
Observations from the 3D case (figures).
28/35

4D Case: Counterexample
Data around two great circles, [cos θ, sin θ, 0, 0]^T and [0, 0, cos θ, sin θ]^T:
X^± = X_{C1} ∪ X_{C2}, where
X_{C1}: [cos(kπ/m), sin(kπ/m), ±δ, ±δ]^T
X_{C2}: [±δ, ±δ, cos(kπ/m), sin(kπ/m)]^T
29/35
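The construction can be generated directly from these formulas. In the sketch below, taking all four sign combinations of ±δ, as well as the values of m and δ, are my own assumptions for experimentation.

```python
# Data for the 4D counterexample: two perturbed great circles in R^4.
import numpy as np
from itertools import product

def counterexample_points(m=20, delta=0.5):
    k = np.arange(2 * m)                                            # angles k*pi/m over the full circle
    circle = np.stack([np.cos(k * np.pi / m), np.sin(k * np.pi / m)], axis=1)      # (2m, 2)
    signs = delta * np.array(list(product([-1.0, 1.0], repeat=2)))  # the four (+/-delta, +/-delta) pairs
    X_C1 = np.vstack([np.hstack([circle, np.tile(s, (len(circle), 1))]) for s in signs])
    X_C2 = np.vstack([np.hstack([np.tile(s, (len(circle), 1)), circle]) for s in signs])
    return np.vstack([X_C1, X_C2]).T                                # columns are the points of X^+-
```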

4D Case: Counterexample
Figure: Orthographic projection to 3D for values of δ on either side of the critical thresholds; the graph is disconnected for δ ∈ (f(m), √2/2).
30/35

Is Connectivity Generic?
In the counterexample, ray(x_i) hits the interior of the facet, which is an open condition, so the disconnection is not confined to a degenerate configuration.
31/35

Conclusion
The importance of subspace clustering, and the advantages of Sparse Subspace Clustering. For d = 2 and d = 3 the within-subspace graph will not fail to be connected; caution must be taken for d ≥ 4 (a post-processing stage, etc.).
32/35

Thanks 33/35

Questions? 34/35
