A Geometric Analysis of Subspace Clustering with Outliers


1 A Geometric Analysis of Subspace Clustering with Outliers Mahdi Soltanolkotabi and Emmanuel Candès Stanford University

2 Fundamental Tool in Data Mining: PCA

4 Subspace Clustering: multi-subspace PCA

9 Index 1 Motivation 2 Problem Formulation 3 Methods 4 Deterministic Model 5 Semi-random Model 6 Fully-random Model 7 With outliers

10 Motivation 1 Motivation 2 Problem Formulation 3 Methods 4 Deterministic Model 5 Semi-random Model 6 Fully-random Model 7 With outliers

11 Motivation Computer Vision Many applications. Figures from various papers/talks. I refer to the interesting lecture by Prof. Ma tomorrow, which highlights the relevance of low-dimensional structures in computer vision. Just add multiple categories :)

12 Motivation Metabolic Screening Goal: Detect metabolic diseases as early as possible in newborns. The concentrations of metabolites are tested. Usually, each metabolic disease causes a correlation between the concentrations of a specific set of metabolites.

13 Problem Formulation 1 Motivation 2 Problem Formulation 3 Methods 4 Deterministic Model 5 Semi-random Model 6 Fully-random Model 7 With outliers

14 Problem Formulation Problem Formulation $N$ unit-norm points in $\mathbb{R}^n$ on a union of unknown linear subspaces $S_1 \cup S_2 \cup \cdots \cup S_L$ of unknown dimensions $d_1, d_2, \ldots, d_L$. Also, $N_0$ uniform outliers. [Figure: inliers and outliers] Goal: without any prior knowledge of the number of subspaces, their orientations, or their dimensions, 1) identify all the outliers, and 2) segment or assign each data point to a cluster so as to recover all the hidden subspaces.

15 Methods 1 Motivation 2 Problem Formulation 3 Methods 4 Deterministic Model 5 Semi-random Model 6 Fully-random Model 7 With outliers

16 Methods Sparse Subspace Clustering Regress one column against the other columns: for $i = 1, \ldots, N$, solve
$$\min \|z_i\|_{\ell_1} \quad \text{subject to} \quad x_i = X z_i, \; z_{ii} = 0.$$
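
To make the per-column program concrete, here is a minimal sketch that solves it as a linear program via the standard splitting $z = u - v$ with $u, v \ge 0$; the function name is illustrative, and SciPy's HiGHS backend is just one reasonable solver choice.

```python
# Minimal sketch of the per-column l1 regression in SSC. The constraint
# z_ii = 0 is enforced by dropping column i from the dictionary.
import numpy as np
from scipy.optimize import linprog

def sparse_self_expression(X, i):
    """Solve min ||z||_1 subject to x_i = X z, z_ii = 0."""
    n, N = X.shape
    idx = np.delete(np.arange(N), i)
    A = X[:, idx]
    # Split z = u - v with u, v >= 0, so that ||z||_1 = sum(u) + sum(v).
    c = np.ones(2 * (N - 1))
    res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=X[:, i],
                  bounds=(0, None), method="highs")
    z = np.zeros(N)
    z[idx] = res.x[:N - 1] - res.x[N - 1:]
    return z
```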

17 Methods $\ell_1$ subspace detection property Definition ($\ell_1$ Subspace Detection Property). The subspaces $\{S_l\}_{l=1}^{L}$ and points $X$ obey the $\ell_1$ subspace detection property if and only if, for all $i$, the optimal solution has nonzero entries only when the corresponding columns of $X$ are in the same subspace as $x_i$.

18 Methods SSC Algorithm
Algorithm 1 Sparse Subspace Clustering (SSC)
Input: A data set arranged as columns of $X \in \mathbb{R}^{n \times N}$.
1. Solve (the optimization variable is the $N \times N$ matrix $Z$)
$$\text{minimize } \|Z\|_1 \quad \text{subject to } XZ = X, \; \mathrm{diag}(Z) = 0.$$
2. Form the affinity graph $G$ with nodes representing the $N$ data points and edge weights given by $W = |Z| + |Z|^T$.
3. Sort the eigenvalues $\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_N$ of the normalized Laplacian of $G$ in descending order, and set
$$\hat{L} = N - \arg\max_{i=1,\ldots,N-1} (\sigma_i - \sigma_{i+1}).$$
4. Apply a spectral clustering technique to the affinity graph using $\hat{L}$.
Output: Partition $X_1, \ldots, X_{\hat{L}}$.
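
Putting the four steps together, here is a compact sketch of the pipeline, reusing sparse_self_expression from the snippet above; the eigengap rule in step 3 estimates the number of subspaces, and scikit-learn's SpectralClustering (an assumption of this sketch, not named on the slides) performs the final step.

```python
# Sketch of Algorithm 1 end to end: self-expression, affinity graph,
# eigengap estimate of the number of subspaces, spectral clustering.
import numpy as np
from scipy.sparse.csgraph import laplacian
from sklearn.cluster import SpectralClustering

def ssc(X):
    n, N = X.shape
    Z = np.column_stack([sparse_self_expression(X, i) for i in range(N)])
    W = np.abs(Z) + np.abs(Z).T                 # symmetric edge weights
    L_norm = laplacian(W, normed=True)          # normalized graph Laplacian
    sigma = np.sort(np.linalg.eigvalsh(L_norm))[::-1]   # descending order
    gaps = sigma[:-1] - sigma[1:]               # sigma_i - sigma_{i+1}
    L_hat = N - (np.argmax(gaps) + 1)           # eigengap estimate of L
    labels = SpectralClustering(n_clusters=L_hat,
                                affinity="precomputed").fit_predict(W)
    return labels, L_hat
```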

19 Methods History of subspace problems and use of $\ell_1$ regression Various subspace clustering algorithms were proposed by Profs. Ma, Sastry, Vidal, Wright, and collaborators. Sparse representation in the multiple-subspace framework was initiated by Prof. Yi Ma and collaborators in the context of face recognition. That setting differs from subspace clustering because the subspaces are known in advance. First used for subspace clustering by Elhamifar and Vidal, who coined the name Sparse Subspace Clustering (SSC).

23 Methods Our goal Prior work is restrictive: it essentially says that the subspaces must be almost independent, and so cannot fully explain why SSC works so well in practice. Limited theoretical guarantees; a huge gap between theory and practice. This is in direct analogy with $\ell_1$ regression results before the compressed sensing results of 2004. Our contribution is to bridge the gap between theory and practice, specifically: Intersecting subspaces. Subspaces of nearly linear dimension. Outliers.

30 Methods Models We will consider three different models regarding the orientation of the subspaces and the distribution of points on each subspace.
Deterministic model: deterministic orientation, deterministic distribution of points.
Semi-random model: deterministic orientation, random distribution of points.
Fully-random model: random orientation, random distribution of points.

31 Deterministic Model 1 Motivation 2 Problem Formulation 3 Methods 4 Deterministic Model 5 Semi-random Model 6 Fully-random Model 7 With outliers

32 Deterministic Model Deterministic Model Theorem (M. Soltanolkotabi, E. J. Candès). If
$$\mu(\mathcal{X}_l) < \min_{i:\, x_i \in X_l} r(\mathcal{P}^{l}_{-i}) \qquad (1)$$
for each $l = 1, \ldots, L$, then the subspace detection property holds. If (1) holds for a given $l$, then a local subspace detection property holds, in the sense that for all $x_i \in X_l$, the solution to the optimization problem has nonzero entries only when the corresponding columns of $X$ are in the same subspace as $x_i$. As long as the point sets are not very similar and the points on each subspace are well distributed, SSC works.

33 Deterministic Model Geometric view of the $\ell_1$ subspace detection property

34 Deterministic Model Geometric view on the condition of the theorem

35 Semi-random Model 1 Motivation 2 Problem Formulation 3 Methods 4 Deterministic Model 5 Semi-random Model 6 Fully-random Model 7 With outliers

36 Semi-random Model Principal angles and affinity between subspaces
Definition. The principal angles $\theta^{(1)}_{k,l}, \ldots, \theta^{(d_k \wedge d_l)}_{k,l}$ between two subspaces $S_k$ and $S_l$ of dimensions $d_k$ and $d_l$ are defined recursively by
$$\cos(\theta^{(i)}_{kl}) = \max_{y \in S_k} \max_{z \in S_l} \frac{y^T z}{\|y\|_{\ell_2} \|z\|_{\ell_2}} = \frac{y_i^T z_i}{\|y_i\|_{\ell_2} \|z_i\|_{\ell_2}},$$
with the orthogonality constraints $y^T y_j = 0$, $z^T z_j = 0$, $j = 1, \ldots, i-1$.
Definition. The affinity between two subspaces is defined by
$$\mathrm{aff}(S_k, S_l) = \sqrt{\cos^2 \theta^{(1)}_{kl} + \cdots + \cos^2 \theta^{(d_k \wedge d_l)}_{kl}}.$$
For random subspaces, the square of the affinity is the Pillai–Bartlett test statistic.
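
A short numerical sketch: when $U_k$ and $U_l$ are orthonormal bases of $S_k$ and $S_l$, the singular values of $U_k^T U_l$ are exactly the cosines of the principal angles, so the affinity is one SVD away. The function name is illustrative.

```python
# Affinity between two subspaces from orthonormal bases: the singular
# values of Uk^T Ul are the cosines of the principal angles.
import numpy as np

def affinity(Uk, Ul):
    cosines = np.linalg.svd(Uk.T @ Ul, compute_uv=False)
    cosines = np.clip(cosines, 0.0, 1.0)   # guard against roundoff
    return np.sqrt(np.sum(cosines ** 2))
```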

37 Semi-random Model Semi-random model Assume we have $N_l = \rho_l d_l + 1$ random points on subspace $S_l$. Theorem (M. Soltanolkotabi, E. J. Candès). Ignoring some log factors, as long as
$$\max_{k:\, k \ne l} \frac{\mathrm{aff}(S_k, S_l)}{\sqrt{d_k}} < c \sqrt{\log \rho_l} \quad \text{for each } l, \qquad (2)$$
the subspace detection property holds with high probability. The affinity can be at most $\sqrt{d_k}$, and therefore our result essentially states that if the affinity is less than $c\sqrt{d_k}$, then SSC works. This allows for intersection of subspaces.

38 Semi-random Model Four error criteria
feature detection error: for each point $x_i$ in subspace $S_{l_i}$, partition the optimal SSC solution $z_i$ (up to a permutation $\Gamma$) as $z_i = \Gamma [z_{i1}^T, z_{i2}^T, z_{i3}^T]^T$, where $z_{i1}$ collects the coefficients of points lying in the same subspace as $x_i$. The error is
$$\frac{1}{N} \sum_{i=1}^{N} \left(1 - \frac{\|z_{i1}\|_{\ell_1}}{\|z_i\|_{\ell_1}}\right).$$
clustering error: (# of misclassified points) / (total # of points).

39 Semi-random Model Four error criteria (continued)
error in estimating the number of subspaces: indicates whether our version of the SSC algorithm identifies the correct number of subspaces; 0 when it succeeds, 1 when it fails.
singular values of the normalized Laplacian: check whether $\sigma_{N-L} < \epsilon$.
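
The first two criteria are easy to compute from the optimal coefficient matrix and ground-truth labels; a hedged sketch follows (the Hungarian matching used for the clustering error is one standard way to align predicted and true labels, not something the slides specify).

```python
# Sketch of the feature detection and clustering errors. Z holds the
# optimal SSC coefficients (column i is z_i); labels are ground truth.
import numpy as np
from scipy.optimize import linear_sum_assignment

def feature_detection_error(Z, labels):
    N = Z.shape[1]
    total = 0.0
    for i in range(N):
        zi = np.abs(Z[:, i])
        same = labels == labels[i]              # columns in x_i's subspace
        total += 1.0 - zi[same].sum() / max(zi.sum(), 1e-12)
    return total / N

def clustering_error(pred, truth):
    K = int(max(pred.max(), truth.max())) + 1
    C = np.zeros((K, K))
    for p, t in zip(pred, truth):
        C[p, t] += 1                            # contingency table
    r, c = linear_sum_assignment(-C)            # best label permutation
    return 1.0 - C[r, c].sum() / len(pred)      # fraction misclassified
```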

40 Semi-random Model Affinity example Choose the difficult case $n = 2d$; the three bases we choose for the subspaces are
$$U^{(1)} = \begin{bmatrix} I_d \\ 0_{d \times d} \end{bmatrix}, \quad U^{(2)} = \begin{bmatrix} 0_{d \times d} \\ I_d \end{bmatrix}, \quad U^{(3)} = \begin{bmatrix} \mathrm{diag}(\cos\theta_1, \ldots, \cos\theta_d) \\ \mathrm{diag}(\sin\theta_1, \ldots, \sin\theta_d) \end{bmatrix},$$
with $\cos\theta_i = \alpha$, $0 \le \alpha \le 1$, for $1 \le i \le d$.
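
Under the reconstruction above (partly an assumption, since the slide's matrix is garbled in the transcription), the normalized affinity between $U^{(1)}$ and $U^{(3)}$ is exactly $\alpha$, as the sketch below verifies with the affinity function defined earlier.

```python
# Bases for the n = 2d example; with cos(theta_i) = alpha the singular
# values of U1^T U3 are all alpha, so aff(S_1, S_3)/sqrt(d) = alpha.
import numpy as np

def example_bases(d, alpha):
    theta = np.full(d, np.arccos(alpha))
    U1 = np.vstack([np.eye(d), np.zeros((d, d))])
    U2 = np.vstack([np.zeros((d, d)), np.eye(d)])
    U3 = np.vstack([np.diag(np.cos(theta)), np.diag(np.sin(theta))])
    return U1, U2, U3

U1, U2, U3 = example_bases(d=10, alpha=0.5)
print(affinity(U1, U3) / np.sqrt(10))   # approximately 0.5
```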

41 Semi-random Model [Figure: feature detection error as a function of $\mathrm{aff}/\sqrt{d}$ and $\rho$]

42 Semi-random Model [Figure: clustering error as a function of $\mathrm{aff}/\sqrt{d}$ and $\rho$]

43 Semi-random Model [Figure: error in estimating the number of subspaces as a function of $\mathrm{aff}/\sqrt{d}$ and $\rho$]

44 Fully-random Model 1 Motivation 2 Problem Formulation 3 Methods 4 Deterministic Model 5 Semi-random Model 6 Fully-random Model 7 With outliers

45 Fully-random Model Fully-random model $L$ subspaces, each of dimension $d$, with $\rho d + 1$ points on each subspace. Theorem (M. Soltanolkotabi, E. J. Candès). The subspace detection property holds with high probability as long as
$$d < \bar{c}\, n, \quad \text{where } \bar{c} = \frac{c^2(\rho) \log \rho}{12 \log N}.$$
This allows the dimension $d$ of the subspaces to grow almost linearly with the ambient dimension $n$. Letting $n$ tend to infinity with $d/n$ constant and $\rho = d^{\eta}$ with $\eta > 0$, the condition reduces to
$$d < \frac{1}{48(\eta + 1)}\, n.$$

48 Fully-random Model Comparison with previous analysis Condition of Elhamifar and Vidal:
$$\max_{k:\, k \ne l} \cos(\theta_{kl}) < \frac{1}{\sqrt{d_l}} \max_{Y \in W_{d_l}(X^{(l)})} \sigma_{\min}(Y) \quad \text{for all } l = 1, \ldots, L.$$
Semi-random model: Elhamifar and Vidal's condition reduces to $\cos\theta_{kl} < 1/\sqrt{d}$. We require
$$\mathrm{aff}(S_k, S_l) = \sqrt{\cos^2(\theta^{(1)}_{kl}) + \cos^2(\theta^{(2)}_{kl}) + \cdots + \cos^2(\theta^{(d)}_{kl})} < c\sqrt{d}.$$
The left-hand side can be much smaller than $\sqrt{d}\cos\theta_{kl}$, so our condition is less restrictive; e.g. $\cos\theta_{kl} < c$ is sufficient.

49 Fully-random Model Comparison with previous analysis, continued Intersecting subspaces: two subspaces with an intersection of dimension $s$. Elhamifar and Vidal's condition becomes $1 < 1/\sqrt{d}$, which cannot hold. Our condition simplifies to
$$\cos^2(\theta^{(s+1)}_{kl}) + \cdots + \cos^2(\theta^{(d)}_{kl}) < c^2 d - s.$$
Fully-random model: Elhamifar and Vidal: $d < c_1 \sqrt{n}$. We essentially require $d < c\, n$.

50 Fully-random Model [Figure: subspaces intersect; panels for $s = 1, 2, 3, 4$]

51 Fully-random Model [Figure: feature detection error as a function of the intersection dimension $s$]

52 Fully-random Model [Figure: clustering error as a function of the intersection dimension $s$]

53 Fully-random Model [Figure: error in estimating the number of subspaces as a function of the intersection dimension $s$]

54 With outliers 1 Motivation 2 Problem Formulation 3 Methods 4 Deterministic Model 5 Semi-random Model 6 Fully-random Model 7 With outliers

55 With outliers Subspace clustering with outliers Assume the outliers are uniform at random on the unit sphere. Again, regress one column against the other columns: for $i = 1, \ldots, N$, solve
$$\min \|z_i\|_{\ell_1} \quad \text{subject to} \quad x_i = X z_i, \; z_{ii} = 0,$$
and consider the optimal values $\|z_i\|_{\ell_1}$. Inliers: intuitively, the support size is about $d$ and $\|z_i\|_{\ell_1} \approx \sqrt{d}$. Outliers: intuitively, the support size is about $n$ and $\|z_i\|_{\ell_1} \approx \sqrt{n}$. Therefore, we should expect a gap.
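
A small Monte Carlo sketch of this gap, reusing sparse_self_expression from earlier; the dimensions are chosen so that the remaining columns span $\mathbb{R}^n$ (otherwise the outlier programs would be infeasible), and all sizes are illustrative.

```python
# l1-norm gap between inliers on one d-dimensional subspace and random
# outliers on the sphere; outliers need many more columns to represent.
import numpy as np

rng = np.random.default_rng(0)
n, d, rho, n_out = 20, 3, 8, 30
U = np.linalg.qr(rng.standard_normal((n, d)))[0]      # random subspace basis
inliers = U @ rng.standard_normal((d, rho * d))
outliers = rng.standard_normal((n, n_out))
X = np.column_stack([inliers, outliers])
X /= np.linalg.norm(X, axis=0)                        # unit-norm columns

l1 = [np.abs(sparse_self_expression(X, i)).sum() for i in range(X.shape[1])]
print("mean inlier  ||z||_1:", np.mean(l1[:rho * d]))
print("mean outlier ||z||_1:", np.mean(l1[rho * d:]))
```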

58 With outliers Inliers: intuitively, support size about $d$, $\|z_i\|_{\ell_1} \approx \sqrt{d}$; here, $\|z_i\|_{\ell_1} = 1.05$. Outliers: intuitively, support size about $n$, $\|z_i\|_{\ell_1} \approx \sqrt{n}$; here, $\|z_i\|_{\ell_1} = 1.72$.

60 With outliers Algorithm for subspace clustering with outliers [M. Soltanolkotabi, E. J. Candès]
Algorithm 2 Subspace clustering in the presence of outliers
Input: A data set arranged as columns of $X \in \mathbb{R}^{n \times N}$.
1. Solve: minimize $\|Z\|_1$ subject to $XZ = X$, $\mathrm{diag}(Z) = 0$.
2. For each $i \in \{1, \ldots, N\}$, declare $i$ to be an outlier iff $\|z_i\|_{\ell_1} > \lambda(\gamma)\sqrt{n}$, where $\gamma = N/n$ and $\lambda$ is a threshold ratio function.
3. Apply a subspace clustering algorithm to the remaining points.
Output: Partition $X_0, X_1, \ldots, X_{\hat{L}}$.
Theorem. Assume there are $N_d$ points to be clustered together with $N_0$ outliers sampled uniformly at random on the $(n-1)$-dimensional unit sphere ($N = N_0 + N_d$). Algorithm 2 detects all of the outliers with high probability as long as
$$N_0 < \frac{1}{n}\, e^{c\sqrt{n}}\, N_d,$$
where $c$ is a numerical constant. Furthermore, suppose the subspaces are $d$-dimensional and of arbitrary orientation, and that each contains $\rho d + 1$ points sampled independently and uniformly…

61 With outliers Threshold function
$$\lambda(\gamma) = \begin{cases} \sqrt{\dfrac{2}{\pi}} - \sqrt{\dfrac{2}{\pi e}}\, \dfrac{1}{\gamma}, & \text{if } 1 \le \gamma \le e, \\[4pt] \sqrt{\dfrac{2}{\pi}}\, \dfrac{1}{\sqrt{\log \gamma}}, & \text{if } \gamma \ge e. \end{cases}$$
[Figure: $\lambda(\gamma)$ as a function of $\gamma$]

62 With outliers With outliers Theorem (M. Soltanolkotabi, E. J. Candès). With outlier points sampled uniformly at random, using the threshold value $(1-t)\,\lambda(\gamma)\sqrt{n}/e$, all outliers are identified correctly with high probability. Furthermore, we have the following guarantees in the deterministic and semi-random models.
(a) If, in the deterministic model,
$$\max_{l,i} \frac{1}{r(\mathcal{P}(X^{(l)}_{-i}))} < (1-t)\,\lambda(\gamma)\,\frac{\sqrt{n}}{e}, \qquad (3)$$
then no real data point is wrongfully detected as an outlier.
(b) If, in the semi-random model,
$$\max_{l} \frac{\sqrt{2 d_l}}{c(\rho_l)\sqrt{\log \rho_l}} < (1-t)\,\lambda(\gamma)\,\frac{\sqrt{n}}{e}, \qquad (4)$$
then w.h.p. no real data point is wrongfully detected as an outlier.

63 With outliers Comparison with previous analysis Our result restricts the number of outliers to
$$N_0 < \min\Big\{\rho\, e^{c_2 n/d},\; \frac{1}{n}\, e^{c\sqrt{n}}\Big\}\, N_d.$$
Lerman and Zhang use
$$e_{l_p}(X, S_1, \ldots, S_L) = \sum_{x \in X} \big(\min_{1 \le l \le L} \mathrm{dist}(x, S_l)\big)^p, \quad 0 \le p \le 1.$$
Their bound on the number of outliers in the semi-random model is
$$N_0 < \tau_0\, \rho d\, \min\big(1, \min_{k \ne l} \mathrm{dist}(S_k, S_l)^p / 2^p\big),$$
which is upper bounded by $\rho d$, the typical number of points per subspace. Their formulation is non-convex, and no practical method is known for solving it.

64 With outliers Segmentation with outliers [Figure: $\ell_1$-norm values against the proven and conjectured thresholds; $d = 5$, $n = 200$, $\rho = 5$, $L = 80$]

65 With outliers References (the list is not comprehensive!)
A geometric analysis of subspace clustering with outliers. M. Soltanolkotabi and E. J. Candès. To appear in the Annals of Statistics.
Noisy case, in preparation. Joint with E. J. Candès, E. Elhamifar, and R. Vidal.
Sparse Subspace Clustering. E. Elhamifar and R. Vidal.
Subspace Clustering. Tutorial by R. Vidal.
See Prof. Yi Ma's website for papers on related topics.

66 With outliers Conclusion Prior work is restrictive: it essentially says that the subspaces must be almost independent, and so cannot fully explain why SSC works so well in practice. This is in direct analogy with $\ell_1$ regression results before the compressed sensing results of 2004. Our contribution is to bridge the gap between theory and practice, specifically: Intersecting subspaces. Subspaces of nearly linear dimension. Outliers. Geometric insights.

73 Thank you
