Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering


Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering

Yoshua Bengio, Jean-François Paiement, Pascal Vincent, Olivier Delalleau, Nicolas Le Roux and Marie Ouimet
Département d'Informatique et Recherche Opérationnelle, Université de Montréal, Montréal, Québec, Canada, H3C 3J7

Abstract

Several unsupervised learning algorithms based on an eigendecomposition provide either an embedding or a clustering only for given training points, with no straightforward extension for out-of-sample examples short of recomputing eigenvectors. This paper provides a unified framework for extending Local Linear Embedding (LLE), Isomap, Laplacian Eigenmaps, Multi-Dimensional Scaling (for dimensionality reduction) as well as for Spectral Clustering. This framework is based on seeing these algorithms as learning eigenfunctions of a data-dependent kernel. Numerical experiments show that the generalizations performed have a level of error comparable to the variability of the embedding algorithms due to the choice of training data.

1 Introduction

Many unsupervised learning algorithms have been recently proposed, all using an eigendecomposition for obtaining a lower-dimensional embedding of data lying on a non-linear manifold: Local Linear Embedding (LLE) (Roweis and Saul, 2000), Isomap (Tenenbaum, de Silva and Langford, 2000) and Laplacian Eigenmaps (Belkin and Niyogi, 2003). There are also many variants of Spectral Clustering (Weiss, 1999; Ng, Jordan and Weiss, 2002), in which such an embedding is an intermediate step before obtaining a clustering of the data that can capture flat, elongated and even curved clusters. The two tasks (manifold learning and clustering) are linked because the clusters found by spectral clustering can be arbitrary curved manifolds (as long as there is enough data to locally capture their curvature).

2 Common Framework

In this paper we consider five types of unsupervised learning algorithms that can be cast in the same framework, based on the computation of an embedding for the training points obtained from the principal eigenvectors of a symmetric matrix.

Algorithm 1

1. Start from a data set $D = \{x_1, \ldots, x_n\}$ with $n$ points in $\mathbb{R}^d$. Construct an $n \times n$ neighborhood or similarity matrix $M$. Let us denote $K_D(\cdot, \cdot)$ (or $K$ for shorthand) the data-dependent function which produces $M$ by $M_{ij} = K_D(x_i, x_j)$.
2. Optionally transform $M$, yielding a normalized matrix $\tilde{M}$. Equivalently, this corresponds to generating $\tilde{M}$ from a $\tilde{K}_D$ by $\tilde{M}_{ij} = \tilde{K}_D(x_i, x_j)$.

3. Compute the $m$ largest positive eigenvalues $\lambda_k$ and eigenvectors $v_k$ of $\tilde{M}$.
4. The embedding of each example $x_i$ is the vector $y_i$ with $y_{ik}$ the $i$-th element of the $k$-th principal eigenvector $v_k$ of $\tilde{M}$. Alternatively (MDS and Isomap), the embedding is $e_i$, with $e_{ik} = \sqrt{\lambda_k}\, y_{ik}$. If the first $m$ eigenvalues are positive, then $e_i \cdot e_j$ is the best approximation of $\tilde{M}_{ij}$ using only $m$ coordinates, in the squared error sense.

In the following, we consider the specializations of Algorithm 1 for different unsupervised learning algorithms. Let $S_i$ be the $i$-th row sum of the affinity matrix $M$:
$$S_i = \sum_j M_{ij}. \tag{1}$$
We say that two points $(a, b)$ are $k$-nearest-neighbors of each other if $a$ is among the $k$ nearest neighbors of $b$ in $D \cup \{a\}$ or vice-versa. We denote by $x_{ij}$ the $j$-th coordinate of the vector $x_i$.

2.1 Multi-Dimensional Scaling

Multi-Dimensional Scaling (MDS) starts from a notion of distance or affinity $K$ that is computed between each pair of training examples. We consider here metric MDS (Cox and Cox, 1994). For the normalization step 2 in Algorithm 1, these distances are converted to equivalent dot products using the double-centering formula:
$$\tilde{M}_{ij} = -\frac{1}{2}\left(M_{ij} - \frac{1}{n}S_i - \frac{1}{n}S_j + \frac{1}{n^2}\sum_k S_k\right). \tag{2}$$
The embedding $e_{ik}$ of example $x_i$ is given by $\sqrt{\lambda_k}\, v_{ik}$.

2.2 Spectral Clustering

Spectral clustering (Weiss, 1999) can yield impressively good results where traditional clustering looking for "round blobs" in the data, such as K-means, would fail miserably. It is based on two main steps: first embedding the data points in a space in which clusters are more "obvious" (using the eigenvectors of a Gram matrix), and then applying a classical clustering algorithm such as K-means, e.g. as in (Ng, Jordan and Weiss, 2002). The affinity matrix $M$ is formed using a kernel such as the Gaussian kernel. Several normalization steps have been proposed. Among the most successful ones, as advocated in (Weiss, 1999; Ng, Jordan and Weiss, 2002), is the following:
$$\tilde{M}_{ij} = \frac{M_{ij}}{\sqrt{S_i S_j}}. \tag{3}$$
To obtain $m$ clusters, the first $m$ principal eigenvectors of $\tilde{M}$ are computed and K-means is applied on the unit-norm coordinates, obtained from the embedding $y_{ik} = v_{ik}$.

2.3 Laplacian Eigenmaps

Laplacian Eigenmaps is a recently proposed dimensionality reduction procedure (Belkin and Niyogi, 2003) that has also been applied to semi-supervised learning. The authors use an approximation of the Laplacian operator, such as the Gaussian kernel or the matrix whose element $(i, j)$ is 1 if $x_i$ and $x_j$ are $k$-nearest-neighbors and 0 otherwise. Instead of solving an ordinary eigenproblem, the following generalized eigenproblem is solved:
$$(S - M)v_j = \lambda_j S v_j \tag{4}$$
with eigenvalues $\lambda_j$, eigenvectors $v_j$ and $S$ the diagonal matrix with entries given by eq. (1). The smallest eigenvalue is left out and the eigenvectors corresponding to the other small eigenvalues are used for the embedding. This is the same embedding that is computed with the spectral clustering algorithm from (Shi and Malik, 1997). As noted in (Weiss, 1999) (Normalization Lemma 1), an equivalent result (up to a componentwise scaling of the embedding) can be obtained by considering the principal eigenvectors of the normalized matrix defined in eq. (3).
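To make the shared structure concrete, here is a minimal NumPy sketch of Algorithm 1 specialized to spectral clustering: a Gaussian affinity (step 1), the divisive normalization of eq. (3) (step 2), and the principal eigenvectors as embedding coordinates (steps 3 and 4). The function name `spectral_embedding` and the bandwidth parameter `sigma` are illustrative choices, not part of the original algorithms.

```python
import numpy as np

def spectral_embedding(X, sigma=1.0, m=2):
    """Algorithm 1 with a Gaussian affinity and the normalization of eq. (3)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    M = np.exp(-sq / (2.0 * sigma ** 2))        # step 1: affinity matrix
    S = M.sum(axis=1)                           # row sums, eq. (1)
    M_tilde = M / np.sqrt(np.outer(S, S))       # step 2: eq. (3)
    lam, v = np.linalg.eigh(M_tilde)            # step 3: eigendecomposition (ascending)
    lam, v = lam[::-1][:m], v[:, ::-1][:, :m]   # keep the m largest eigenpairs
    return v, lam                               # step 4: y_ik = v_ik
```

Running K-means on the rows of `v` (rescaled to unit norm) then yields the clusters; per the normalization lemma cited above, solving the generalized eigenproblem of eq. (4) instead gives the same embedding up to a componentwise scaling.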

2.4 Isomap

Isomap (Tenenbaum, de Silva and Langford, 2000) generalizes MDS to non-linear manifolds. It is based on replacing the Euclidean distance by an approximation of the geodesic distance on the manifold. We define the geodesic distance with respect to a data set $D$, a distance $d(u, v)$ and a neighborhood $k$ as follows:
$$\tilde{D}(a, b) = \min_p \sum_i d(p_i, p_{i+1}) \tag{5}$$
where $p$ is a sequence of points of length $l \geq 2$ with $p_1 = a$, $p_l = b$, $p_i \in D$ for $i \in \{2, \ldots, l-1\}$, and $(p_i, p_{i+1})$ are $k$-nearest-neighbors. The length $l$ is free in the minimization. The Isomap algorithm obtains the normalized matrix $\tilde{M}$ from which the embedding is derived by transforming the raw pairwise distances matrix as follows: first compute the matrix $M_{ij} = \tilde{D}^2(x_i, x_j)$ of squared geodesic distances with respect to the data $D$, then apply to this matrix the distance-to-dot-product transformation (eq. (2)), as for MDS. As in MDS, the embedding is $e_{ik} = \sqrt{\lambda_k}\, v_{ik}$ rather than $y_{ik} = v_{ik}$.

2.5 LLE

The Local Linear Embedding (LLE) algorithm (Roweis and Saul, 2000) looks for an embedding that preserves the local geometry in the neighborhood of each data point. First, a sparse matrix of local predictive weights $W_{ij}$ is computed, such that $\sum_j W_{ij} = 1$, $W_{ij} = 0$ if $x_j$ is not a $k$-nearest-neighbor of $x_i$, and $\|\sum_j W_{ij} x_j - x_i\|^2$ is minimized. Then the matrix
$$M = (I - W)^\top (I - W) \tag{6}$$
is formed. The embedding is obtained from the lowest eigenvectors of $M$, except for the smallest eigenvector, which is uninteresting because it is $(1, 1, \ldots, 1)$, with eigenvalue 0. Note that the lowest eigenvectors of $M$ are the largest eigenvectors of $\tilde{M}_\mu = \mu I - M$, which fits Algorithm 1 (the use of $\mu > 0$ will be discussed in section 4.4). The embedding is given by $y_{ik} = v_{ik}$, and is constant with respect to $\mu$.
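The Isomap specialization chains eq. (5) with the double-centering of eq. (2). The sketch below is a non-authoritative illustration, assuming SciPy's Dijkstra routine and a connected $k$-nearest-neighbor graph; the helper name `isomap_embedding` is ours.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

def isomap_embedding(X, k=10, m=2):
    """Eq. (5) + eq. (2): squared geodesics on the kNN graph, double-centered,
    then e_ik = sqrt(lambda_k) * v_ik (assumes the kNN graph is connected)."""
    n = len(X)
    d = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    idx = np.argsort(d, axis=1)[:, 1:k + 1]      # k nearest neighbors of each point
    G = np.full((n, n), np.inf)                  # inf marks "no edge" for csgraph
    rows = np.repeat(np.arange(n), k)
    G[rows, idx.ravel()] = d[rows, idx.ravel()]
    G = np.minimum(G, G.T)                       # neighbors "or vice-versa"
    M = shortest_path(G, method="D") ** 2        # Dijkstra -> squared geodesics
    S = M.sum(axis=1)                            # row sums, eq. (1)
    M_tilde = -0.5 * (M - S[:, None] / n - S[None, :] / n + S.sum() / n ** 2)
    lam, v = np.linalg.eigh(M_tilde)
    lam, v = lam[::-1][:m], v[:, ::-1][:, :m]    # m largest eigenpairs
    return np.sqrt(np.maximum(lam, 0.0)) * v     # e_ik = sqrt(lambda_k) v_ik
```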

3 From Eigenvectors to Eigenfunctions

To obtain an embedding for a new data point, we propose to use the Nyström formula (eq. (9)) (Baker, 1977), which has been used successfully to speed up kernel methods computations by focusing the heavier computations (the eigendecomposition) on a subset of examples. The use of this formula can be justified by considering the convergence of eigenvectors and eigenvalues as the number of examples increases (Baker, 1977; Williams and Seeger, 2000; Koltchinskii and Giné, 2000; Shawe-Taylor and Williams, 2003). Intuitively, the extensions to obtain the embedding for a new example require specifying a new column of the Gram matrix $M$, through a training-set dependent kernel function $K_D$, in which one of the arguments may be required to be in the training set.

If we start from a data set $D$, obtain an embedding for its elements, and add more and more data, the embedding for the points in $D$ converges (for eigenvalues that are unique). (Shawe-Taylor and Williams, 2003) give bounds on the convergence error (in the case of kernel PCA). In the limit, we expect each eigenvector to converge to an eigenfunction for the linear operator defined below, in the sense that the $i$-th element of the $k$-th eigenvector converges to the application of the $k$-th eigenfunction to $x_i$ (up to a normalization factor).

Consider a Hilbert space $\mathcal{H}_p$ of functions with inner product $\langle f, g \rangle_p = \int f(x) g(x)\, p(x)\, dx$, with a density function $p(x)$. Associate with kernel $K$ a linear operator $K_p$ in $\mathcal{H}_p$:
$$(K_p f)(x) = \int K(x, y) f(y)\, p(y)\, dy. \tag{7}$$
We don't know the true density $p$, but we can approximate the above inner product and linear operator (and its eigenfunctions) using the empirical distribution $\hat{p}$. An empirical Hilbert space $\mathcal{H}_{\hat{p}}$ is thus defined using $\hat{p}$ instead of $p$. Note that the proposition below can be applied even if the kernel is not positive semi-definite, although the embedding algorithms we have studied are restricted to using the principal coordinates associated with positive eigenvalues. For a more rigorous mathematical analysis, see (Bengio et al., 2003).

Proposition 1
Let $\tilde{K}(a, b)$ be a kernel function, not necessarily positive semi-definite, that gives rise to a symmetric matrix $\tilde{M}$ with entries $\tilde{M}_{ij} = \tilde{K}(x_i, x_j)$ upon a data set $D = \{x_1, \ldots, x_n\}$. Let $(v_k, \lambda_k)$ be an (eigenvector, eigenvalue) pair that solves $\tilde{M} v_k = \lambda_k v_k$. Let $(f_k, \lambda'_k)$ be an (eigenfunction, eigenvalue) pair that solves $(\tilde{K}_{\hat{p}} f_k)(x) = \lambda'_k f_k(x)$ for any $x$, with $\hat{p}$ the empirical distribution over $D$. Let $e_k(x) = y_k(x)\sqrt{\lambda_k}$ or $y_k(x)$ denote the embedding associated with a new point $x$. Then
$$\lambda'_k = \frac{1}{n}\,\lambda_k \tag{8}$$
$$f_k(x) = \frac{\sqrt{n}}{\lambda_k} \sum_{i=1}^n v_{ik}\, \tilde{K}(x, x_i) \tag{9}$$
$$f_k(x_i) = \sqrt{n}\, v_{ik} \tag{10}$$
$$y_k(x) = \frac{f_k(x)}{\sqrt{n}} = \frac{1}{\lambda_k} \sum_{i=1}^n v_{ik}\, \tilde{K}(x, x_i) \tag{11}$$
$$y_k(x_i) = y_{ik}, \qquad e_k(x_i) = e_{ik}. \tag{12}$$
See (Bengio et al., 2003) for a proof and further justifications of the above formulae. The generalized embedding for Isomap and MDS is $e_k(x) = \sqrt{\lambda_k}\, y_k(x)$, whereas the one for spectral clustering, Laplacian eigenmaps and LLE is $y_k(x)$.

Proposition 2
In addition, if the data-dependent kernel $\tilde{K}_D$ is positive semi-definite, then
$$f_k(x) = \sqrt{\frac{n}{\lambda_k}}\, \pi_k(x)$$
where $\pi_k(x)$ is the $k$-th component of the kernel PCA projection of $x$ obtained from the kernel $\tilde{K}_D$ (up to centering).

This relation with kernel PCA (Schölkopf, Smola and Müller, 1998), already pointed out in (Williams and Seeger, 2000), is further discussed in (Bengio et al., 2003).

4 Extending to new Points

Using Proposition 1, one obtains a natural extension of all the unsupervised learning algorithms mapped to Algorithm 1, provided we can write down a kernel function $\tilde{K}$ that gives rise to the matrix $\tilde{M}$ on $D$, and can be used in eq. (11) to generalize the embedding. We consider each of them in turn below. In addition to the convergence properties discussed in section 3, another justification for using equation (9) is given by the following proposition:

Proposition 3
If we define the $f_k(x_i)$ by eq. (10) and take a new point $x$, the value of $f_k(x)$ that minimizes
$$\sum_{i=1}^n \left( \tilde{K}(x, x_i) - \sum_{t=1}^m \lambda'_t f_t(x) f_t(x_i) \right)^2 \tag{13}$$
is given by eq. (9), for $m \geq 1$ and any $k \leq m$.

The proof is a direct consequence of the orthogonality of the eigenvectors $v_k$. This proposition links equations (9) and (10).
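A direct reading of eq. (11) gives the out-of-sample procedure: evaluate the data-dependent kernel between the new point and every training point (the "new column" of the Gram matrix), then take weighted sums against the stored eigenvectors. The minimal sketch below assumes the eigenvectors are stored column-wise and uses an illustrative callable `K_tilde` standing in for $\tilde{K}_D$.

```python
import numpy as np

def nystrom_embedding(x, X, v, lam, K_tilde):
    """Eq. (11): y_k(x) = (1/lambda_k) * sum_i v_ik * K~(x, x_i).

    v: (n, m) principal eigenvectors of M~, column-wise; lam: (m,) eigenvalues;
    K_tilde: the data-dependent normalized kernel K~_D, as a Python callable."""
    Kx = np.array([K_tilde(x, xi) for xi in X])  # new column of the Gram matrix
    y = (v.T @ Kx) / lam                         # y_k(x) for each k
    return y                                     # for Isomap/MDS: np.sqrt(lam) * y
```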

Indeed, we can obtain eq. (10) when trying to approximate $\tilde{K}$ at the data points by minimizing the cost
$$\sum_{i,j=1}^n \left( \tilde{K}(x_i, x_j) - \sum_{t=1}^m \lambda'_t f_t(x_i) f_t(x_j) \right)^2$$
for $m = 1, 2, \ldots$ When we add a new point $x$, it is thus natural to use the same cost to approximate the $\tilde{K}(x, x_i)$, which yields (13). Note that by doing so, we do not seek to approximate $\tilde{K}(x, x)$. Future work should investigate embeddings which minimize the empirical reconstruction error of $\tilde{K}$ but ignore the diagonal contributions.

4.1 Extending MDS

For MDS, a normalized kernel can be defined as follows, using a "continuous" version of the double-centering eq. (2):
$$\tilde{K}(a, b) = -\frac{1}{2}\left( d^2(a, b) - E_x[d^2(x, b)] - E_{x'}[d^2(a, x')] + E_{x,x'}[d^2(x, x')] \right) \tag{14}$$
where $d(a, b)$ is the original distance and the expectations are taken over the empirical data $D$. An extension of metric MDS to new points has already been proposed in (Gower, 1968), solving exactly for the embedding of $x$ to be consistent with its distances to training points, which in general requires adding a new dimension.

4.2 Extending Spectral Clustering and Laplacian Eigenmaps

Both the version of Spectral Clustering and Laplacian Eigenmaps described above are based on an initial kernel $K$, such as the Gaussian or nearest-neighbor kernel. An equivalent normalized kernel is:
$$\tilde{K}(a, b) = \frac{1}{n} \frac{K(a, b)}{\sqrt{E_x[K(a, x)]\, E_{x'}[K(b, x')]}}$$
where the expectations are taken over the empirical data $D$.

4.3 Extending Isomap

To extend Isomap, the test point is not used in computing the geodesic distance between training points, otherwise we would have to recompute all the geodesic distances. A reasonable solution is to use the definition of $\tilde{D}(a, b)$ in eq. (5), which only uses the training points as the intermediate points on the path from $a$ to $b$. We obtain a normalized kernel by applying the continuous double-centering of eq. (14) with $d = \tilde{D}$. A formula has already been proposed (de Silva and Tenenbaum, 2003) to approximate Isomap using only a subset of the examples (the "landmark" points) to compute the eigenvectors. Using our notations, this formula is
$$e'_k(x) = \frac{1}{2\sqrt{\lambda_k}} \sum_i v_{ik} \left( E_{x'}[\tilde{D}^2(x', x_i)] - \tilde{D}^2(x_i, x) \right). \tag{15}$$
The formula is applied to obtain an embedding for the non-landmark examples.

Corollary 1
The embedding proposed in Proposition 1 for Isomap ($e_k(x)$) is equal to formula (15) (Landmark Isomap) when $\tilde{K}(x, y)$ is defined as in eq. (14) with $d = \tilde{D}$.

Proof: the proof relies on a property of the Gram matrix for Isomap: $\sum_i \tilde{M}_{ij} = 0$, by construction. Therefore $(1, 1, \ldots, 1)$ is an eigenvector with eigenvalue 0, and all the other eigenvectors $v_k$ have the property $\sum_i v_{ik} = 0$ because of the orthogonality with $(1, 1, \ldots, 1)$. Writing $\left(E_{x'}[\tilde{D}^2(x', x_i)] - \tilde{D}^2(x_i, x)\right) = 2\tilde{K}(x, x_i) + E_{x',x''}[\tilde{D}^2(x', x'')] - E_{x'}[\tilde{D}^2(x, x')]$ yields
$$e'_k(x) = \frac{1}{\sqrt{\lambda_k}} \sum_i v_{ik}\, \tilde{K}(x, x_i) + \frac{1}{2\sqrt{\lambda_k}} \left( E_{x',x''}[\tilde{D}^2(x', x'')] - E_{x'}[\tilde{D}^2(x, x')] \right) \sum_i v_{ik} = e_k(x),$$
since the last sum is 0.
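The two normalized kernels above translate directly into code. The following sketch evaluates eq. (14) and the section 4.2 kernel naively, recomputing the empirical expectations on every call; in practice one would precompute the training-set means once. Function names are ours, not the paper's.

```python
import numpy as np

def mds_kernel(a, b, X, d):
    """Eq. (14): continuous double-centering; expectations over training set X."""
    d2_a = np.array([d(a, xp) ** 2 for xp in X])   # E_{x'}[d^2(a, x')]
    d2_b = np.array([d(xp, b) ** 2 for xp in X])   # E_x[d^2(x, b)]
    d2_all = np.array([[d(xi, xj) ** 2 for xj in X] for xi in X])
    return -0.5 * (d(a, b) ** 2 - d2_b.mean() - d2_a.mean() + d2_all.mean())

def sc_kernel(a, b, X, K):
    """Section 4.2 normalized kernel for spectral clustering / eigenmaps."""
    Ea = np.mean([K(a, xp) for xp in X])           # E_x[K(a, x)]
    Eb = np.mean([K(b, xp) for xp in X])           # E_{x'}[K(b, x')]
    return K(a, b) / (len(X) * np.sqrt(Ea * Eb))
```

Either callable can be passed as `K_tilde` to the Nyström sketch of section 3; for Isomap, `mds_kernel` is used with $d = \tilde{D}$.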

4.4 Extending LLE

The extension of LLE is the most challenging one because it does not fit as well into the framework of Algorithm 1: the $M$ matrix for LLE does not have a clear interpretation in terms of distance or dot product. An extension has been proposed in (Saul and Roweis, 2002), but unfortunately it cannot be cast directly into the framework of Proposition 1. Their embedding of a new point $x$ is given by
$$y_k(x) = \sum_{i=1}^n y_{ik}\, w(x, x_i) \tag{16}$$
where $w(x, x_i)$ is the weight of $x_i$ in the reconstruction of $x$ by its $k$ nearest neighbors in the training set (if $x = x_j \in D$, $w(x, x_i) = \delta_{ij}$). This is very close to eq. (11), but lacks the normalization by $\lambda_k$. However, we can see this embedding as a limit case of Proposition 1, as shown below.

We first need to define a kernel $\tilde{K}_\mu$ such that
$$\tilde{K}_\mu(x_i, x_j) = \tilde{M}_{\mu,ij} = (\mu - 1)\delta_{ij} + W_{ij} + W_{ji} - \sum_k W_{ki} W_{kj} \tag{17}$$
for $x_i, x_j \in D$. Let us define a kernel $\tilde{K}'$ by $\tilde{K}'(x_i, x) = \tilde{K}'(x, x_i) = w(x, x_i)$ and $\tilde{K}'(x, y) = 0$ when neither $x$ nor $y$ is in the training set $D$. Let $\tilde{K}''$ be defined by $\tilde{K}''(x_i, x_j) = W_{ij} + W_{ji} - \sum_k W_{ki} W_{kj}$ and $\tilde{K}''(x, y) = 0$ when either $x$ or $y$ isn't in $D$. Then, by construction, the kernel $\tilde{K}_\mu = (\mu - 1)\tilde{K}' + \tilde{K}''$ verifies eq. (17). Thus, we can apply eq. (11) to obtain an embedding of a new point $x$, which yields
$$y_{\mu,k}(x) = \frac{1}{\lambda_k} \sum_i y_{ik} \left( (\mu - 1)\tilde{K}'(x, x_i) + \tilde{K}''(x, x_i) \right)$$
with $\lambda_k = (\mu - \hat{\lambda}_k)$, and $\hat{\lambda}_k$ being the $k$-th lowest eigenvalue of $M$. This rewrites into
$$y_{\mu,k}(x) = \frac{\mu - 1}{\mu - \hat{\lambda}_k} \sum_i y_{ik}\, w(x, x_i) + \frac{1}{\mu - \hat{\lambda}_k} \sum_i y_{ik}\, \tilde{K}''(x, x_i).$$
Then when $\mu \to \infty$, $y_{\mu,k}(x) \to y_k(x)$ as defined by eq. (16). Since the choice of $\mu$ is free, we can thus consider eq. (16) as approximating the use of the kernel $\tilde{K}_\mu$ with a large $\mu$ in Proposition 1. This is what we have done in the experiments described in the next section. Note however that we can find smoother kernels $\tilde{K}_\mu$ verifying eq. (17), giving other extensions of LLE from Proposition 1. It is out of the scope of this paper to study which kernel is best for generalization, but it seems desirable to use a smooth kernel that would take into account not only the reconstruction of $x$ by its neighbors $x_i$, but also the reconstruction of the $x_i$ by their neighbors including the new point $x$.
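Below is a small sketch of this limit-case extension: the weights $w(x, x_i)$ come from the standard constrained least-squares solve of Saul and Roweis (with a regularization term added, since the local Gram matrix can be singular when $k$ exceeds the input dimension), and eq. (16) is then a weighted sum of the training embeddings. Function names and the regularization constant are illustrative assumptions.

```python
import numpy as np

def reconstruction_weights(x, X, k=10):
    """w(x, x_i): weights reconstructing x from its k nearest training points,
    constrained to sum to one (closed-form solve, as in Saul & Roweis)."""
    nbrs = np.argsort(((X - x) ** 2).sum(axis=1))[:k]
    Z = X[nbrs] - x                         # neighbors centered on x
    C = Z @ Z.T                             # local Gram matrix
    C += np.eye(k) * 1e-3 * np.trace(C)     # regularize: C may be singular if k > d
    w = np.linalg.solve(C, np.ones(k))
    w /= w.sum()                            # enforce the sum-to-one constraint
    out = np.zeros(len(X))
    out[nbrs] = w
    return out

def lle_extension(x, X, Y):
    """Eq. (16): y_k(x) = sum_i y_ik w(x, x_i), i.e. the mu -> infinity limit."""
    return Y.T @ reconstruction_weights(x, X)
```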

5 Experiments

We want to evaluate whether the precision of the generalizations suggested in the previous section is comparable to the intrinsic perturbations of the embedding algorithms. The perturbation analysis will be achieved by considering splits of the data into three sets, $D = F \cup R_1 \cup R_2$, and training either with $F \cup R_1$ or $F \cup R_2$, comparing the embeddings on $F$. For each algorithm described in section 2, we apply the following procedure:

1. We choose $F \subset D$ with $m = |F|$ samples. The remaining $n - m$ samples in $D \setminus F$ are split into two equal-size subsets $R_1$ and $R_2$. We train (obtain the eigenvectors) over $F \cup R_1$ and $F \cup R_2$. When eigenvalues are close, the estimated eigenvectors are unstable and can rotate in the subspace they span. Thus we estimate an affine alignment between the two embeddings using the points in $F$, and we calculate the Euclidean distance between the aligned embeddings obtained for each $s_i \in F$.
2. For each sample $s_i \in F$, we also train over $\{F \cup R_1\} \setminus \{s_i\}$. We apply the extension to out-of-sample points to find the predicted embedding of $s_i$ and calculate the Euclidean distance between this embedding and the one obtained when training with $F \cup R_1$, i.e. with $s_i$ in the training set.
3. We calculate the mean difference (and its standard error, shown in the figure) between the distance obtained in step 1 and the one obtained in step 2 for each sample $s_i \in F$, and we repeat this experiment for various sizes of $F$.

[Figure 1: Training set variability minus out-of-sample error, w.r.t. the proportion of training samples substituted. Top left: MDS. Top right: spectral clustering or Laplacian eigenmaps. Bottom left: Isomap. Bottom right: LLE. Error bars are 95% confidence intervals.]

The results obtained for MDS, Isomap, spectral clustering and LLE are shown in figure 1 for different values of $m$. Experiments are done over a database of 698 synthetic face images described by 4096 components that is available online. Qualitatively similar results have been obtained over other databases such as Ionosphere (mlearn/mlsummary.html) and swissroll (roweis/lle/). Each algorithm generates a two-dimensional embedding of the images, following the experiments reported for Isomap. The number of neighbors is 10 for Isomap and LLE, and a Gaussian kernel with a standard deviation of 0.01 is used for spectral clustering / Laplacian eigenmaps.
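For completeness, here is one plausible reading of the affine alignment used in step 1, sketched as an ordinary least-squares fit in homogeneous coordinates; the paper does not spell out the estimator, so this is an assumption rather than the authors' exact procedure.

```python
import numpy as np

def affine_align(A, B):
    """Least-squares affine map sending embedding A onto embedding B, fit on
    the common points F, to factor out rotations of near-degenerate
    eigenvector subspaces before per-point distances are compared (step 1)."""
    A1 = np.hstack([A, np.ones((len(A), 1))])     # homogeneous coordinates
    T, *_ = np.linalg.lstsq(A1, B, rcond=None)    # affine transform, shape (3, 2)
    return A1 @ T

# per-point disagreement between the two training runs, measured on F:
# d1 = np.linalg.norm(affine_align(emb_FR1, emb_FR2) - emb_FR2, axis=1)
```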

95% confidence intervals are drawn beside each mean difference of error on the figure. As expected, the mean difference between the two distances is almost monotonically increasing as the fraction of substituted examples grows (x-axis in the figure). In most cases, the out-of-sample error is less than or comparable to the training set embedding stability: it corresponds to substituting a fraction of between 1 and 4% of the training examples.

6 Conclusions

In this paper we have presented an extension to five unsupervised learning algorithms based on a spectral embedding of the data: MDS, spectral clustering, Laplacian eigenmaps, Isomap and LLE. This extension allows one to apply a trained model to out-of-sample points without having to recompute eigenvectors. It introduces a notion of function induction and generalization error for these algorithms. The experiments on real high-dimensional data show that the average distance between the out-of-sample and in-sample embeddings is comparable to or lower than the variation in the in-sample embedding due to replacing a few points in the training set.

References

Baker, C. (1977). The Numerical Treatment of Integral Equations. Clarendon Press, Oxford.

Belkin, M. and Niyogi, P. (2003). Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation, 15(6).

Bengio, Y., Vincent, P., Paiement, J., Delalleau, O., Ouimet, M., and Le Roux, N. (2003). Spectral clustering and kernel PCA are learning eigenfunctions. Technical report, Département d'informatique et recherche opérationnelle, Université de Montréal.

Cox, T. and Cox, M. (1994). Multidimensional Scaling. Chapman & Hall, London.

de Silva, V. and Tenenbaum, J. (2003). Global versus local methods in nonlinear dimensionality reduction. In Becker, S., Thrun, S., and Obermayer, K., editors, Advances in Neural Information Processing Systems 15, Cambridge, MA. MIT Press.

Gower, J. (1968). Adding a point to vector diagrams in multivariate analysis. Biometrika, 55(3).

Koltchinskii, V. and Giné, E. (2000). Random matrix approximation of spectra of integral operators. Bernoulli, 6(1).

Ng, A. Y., Jordan, M. I., and Weiss, Y. (2002). On spectral clustering: analysis and an algorithm. In Dietterich, T. G., Becker, S., and Ghahramani, Z., editors, Advances in Neural Information Processing Systems 14, Cambridge, MA. MIT Press.

Roweis, S. and Saul, L. (2000). Nonlinear dimensionality reduction by locally linear embedding. Science, 290(5500).

Saul, L. and Roweis, S. (2002). Think globally, fit locally: unsupervised learning of low dimensional manifolds. Journal of Machine Learning Research, 4.

Schölkopf, B., Smola, A., and Müller, K.-R. (1998). Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 10.

Shawe-Taylor, J. and Williams, C. (2003). The stability of kernel principal components analysis and its relation to the process eigenspectrum. In Becker, S., Thrun, S., and Obermayer, K., editors, Advances in Neural Information Processing Systems 15. MIT Press.

Shi, J. and Malik, J. (1997). Normalized cuts and image segmentation. In Proc. IEEE Conf. Computer Vision and Pattern Recognition.

Tenenbaum, J., de Silva, V., and Langford, J. (2000). A global geometric framework for nonlinear dimensionality reduction. Science, 290(5500).

Weiss, Y. (1999). Segmentation using eigenvectors: a unifying view. In Proceedings IEEE International Conference on Computer Vision.

Williams, C. and Seeger, M. (2000). The effect of the input density distribution on kernel-based classifiers. In Proceedings of the Seventeenth International Conference on Machine Learning. Morgan Kaufmann.
