Semi-Supervised Discriminant Analysis Based On Data Structure

IOSR Journal of Computer Engineering (IOSR-JCE), Volume 17, Issue 3, Ver. VII (May-Jun. 2015)

Semi-Supervised Discriminant Analysis Based On Data Structure

Xuesong Yin*, Rongrong Jiang, Lifei Jiang
Department of Computer Science & Technology, Zhejiang Radio & TV University, Hangzhou, China
*Corresponding author: Xuesong Yin, yinxs@nuaa.edu.cn

Abstract: Dimensionality reduction is a key data-analytic technique for mining high-dimensional data. In this paper, we consider a general problem of learning from pairwise constraints in the form of must-link and cannot-link. As one kind of side information, the must-link constraints imply that a pair of instances belongs to the same class, while the cannot-link constraints compel them to belong to different classes. Given must-link and cannot-link information, the goal of this paper is to learn a smooth and discriminative subspace. Specifically, in order to generate such a subspace, we use pairwise constraints to formulate an optimization problem in which a least squares formulation that integrates both global and local structures serves as a regularization term for dimensionality reduction. Experimental results on benchmark data sets show the effectiveness of the proposed algorithm.

Keywords: Dimensionality reduction; Pairwise constraints; High-dimensional data; Discriminant analysis.

I. Introduction

Applications in various domains such as face recognition, web mining and image retrieval often lead to very high-dimensional data. Mining and understanding such high-dimensional data is a contemporary challenge due to the curse of dimensionality. Current research [3,4,5,6,13] shows that reducing the dimensionality of the data is a feasible way to characterize it more precisely. Dimensionality reduction (DR for short) therefore makes sense in many practical problems [2,3,5], since it allows one to represent the data in a lower-dimensional space. Based on the availability of side information, dimensionality reduction algorithms fall into three categories: supervised DR [7,12,28,35], semi-supervised DR [8,16,18,21,22] and unsupervised DR [10,13,14,15,34]. In this paper, we focus on semi-supervised DR.

With only a few constraints or class labels available, existing semi-supervised DR algorithms project the observed data onto a low-dimensional manifold where the margin between data from different classes is maximized. Most algorithms in this category, such as Locality Sensitive Discriminant Analysis [20], Locality Sensitive Semi-supervised Feature Selection [36], and Semi-supervised Discriminant Analysis [26], take the intrinsic geometric structure of the data into account, i.e. its local structure, reflecting that nearby instances are likely to have the same label. However, the global structure of the data, which implies that instances on the same structure (typically referred to as a cluster or a manifold) are likely to have the same label [1], is ignored by those algorithms. In fact, when mining data, global structure is as significant as local structure [1,10,11,29]. It is therefore necessary to integrate both global and local structures into the DR process.

The key challenge is how to incorporate relatively few pairwise constraints into DR so that the reduced representation still captures the available class information. To this end, we propose a semi-supervised discriminant analysis algorithm that integrates both global and local structures (SGLS), which naturally addresses DR under semi-supervised settings. Given pairwise constraints, the key idea in SGLS is to integrate both global and local structures in a semi-supervised discriminant framework so that both the discriminant and the geometric structures of the data can be accurately captured.
The SGLS algorithm thus has the same flavor as supervised DR algorithms, which adjust the distances among instances to improve the separability of the data, as well as unsupervised DR algorithms, which make instances that are nearby in the original space close to each other in the embedding space.

The remainder of the paper is organized as follows. In Section 2, related work on existing semi-supervised DR algorithms is discussed. In Section 3, we introduce the general framework of our proposed SGLS algorithm. Section 4 discusses the experimental results on a number of real-world data sets. Finally, we draw conclusions in Section 5.

II. Related Work

Unsupervised DR has attracted considerable interest in the last decade [9,19,23,25,29]. Recently, several researchers [5,9,13,17] have considered discovering the local geometrical structure of the data manifold when the data lie in a low-dimensional manifold embedded in the high-dimensional ambient space. Algorithms developed along this line are used to estimate geometrical and topological properties of the embedding space,

such as Locality Preserving Projections (LPP) [34], Local Discriminant Embedding (LDE) [35], Marginal Fisher Analysis (MFA) [37] and Transductive Component Analysis (TCA) [27]. Differently from typical unsupervised DR algorithms, K-means has more recently been applied in the dimension-reduced space to avoid the curse of dimensionality in several algorithms [25,31] that perform clustering and dimensionality reduction simultaneously.

Linear Discriminant Analysis (LDA) [7,12], which captures the global geometric structure of the data by simultaneously maximizing the between-class distance and minimizing the within-class distance, is a well-known approach to supervised DR. It has been widely used in various applications, such as face recognition [5]. However, a major shortcoming of LDA is that it fails to discover the local geometrical structure of the data manifold.

Semi-supervised DR can be considered a new issue in the DR area; it learns from a combination of side information and unlabeled data. In general, the side information can be expressed in diverse forms, such as class labels and pairwise constraints. Pairwise constraints include the must-link form, in which two instances belong to the same class, and the cannot-link form, in which two instances belong to different classes. Some semi-supervised DR algorithms [20,26,27,30,36,37] apply only partially labeled instances to maximize the margin between instances from different classes and thereby improve performance. However, in many practical data mining applications it is fairly expensive to label instances, whereas it can be easier for an expert or a user to specify whether some pairs of instances belong to the same class or not. Moreover, pairwise constraints can be derived from labeled data, but not vice versa. Therefore, making use of pairwise constraints to reduce the dimensionality of the data has become an important topic [8,16,33].

III. The SGLS Algorithm

3.1 The Framework Integrating Global and Local Structures

Given a set of instances x_1, ..., x_n \in R^N, we can use a k-nearest neighbor graph to model the relationship between nearby data points. Specifically, we put an edge between nodes i and j if x_i and x_j are close, i.e. x_i and x_j are among the k nearest neighbors of each other. Let the corresponding weight matrix be S, defined by

S_{ij} = \exp(-\|x_i - x_j\|^2 / t)  if x_i \in N_k(x_j) or x_j \in N_k(x_i),  and S_{ij} = 0 otherwise,   (1)

where N_k(x_i) denotes the set of k nearest neighbors of x_i. In general, if a weight S_{ij} is large, x_i and x_j are close to each other in the original space. Thus, in order to preserve the locality structure, the embedding obtained should be as smooth as possible. To learn an appropriate representation y = [y_1, ..., y_n]^T with the locality structure, it is common to minimize the following objective function [13,34]:

(1/2) \sum_{i,j} (y_i - y_j)^2 S_{ij} = y^T L y,   (2)

where D is a diagonal matrix whose diagonal entries are the column (or, since S is symmetric, row) sums of S, that is D_{ii} = \sum_j S_{ij}, and L = D - S is the Laplacian matrix.
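As a concrete illustration of this construction, the following sketch builds the weight matrix S of Eq. (1) and the Laplacian L = D - S used in Eq. (2). It is written in Python/NumPy purely for illustration (the authors' implementation is in MATLAB), and the function name and the row-per-instance data layout are our own choices rather than anything specified in the paper.

```python
import numpy as np

def knn_heat_kernel_graph(X, k=5, t=1.0):
    """Build the weight matrix S of Eq. (1) and the Laplacian L = D - S of Eq. (2).

    X is an (n, N) array whose rows are the instances x_1, ..., x_n.
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances ||x_i - x_j||^2.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    np.fill_diagonal(d2, np.inf)                 # exclude self-neighbours

    # x_j is a neighbour of x_i if it is among the k nearest points of x_i.
    knn_idx = np.argsort(d2, axis=1)[:, :k]
    neighbour = np.zeros((n, n), dtype=bool)
    neighbour[np.repeat(np.arange(n), k), knn_idx.ravel()] = True
    # Edge (i, j) exists if x_i is in N_k(x_j) OR x_j is in N_k(x_i).
    neighbour |= neighbour.T

    S = np.where(neighbour, np.exp(-d2 / t), 0.0)  # heat-kernel weights, Eq. (1)
    D = np.diag(S.sum(axis=1))                     # degree matrix
    return S, D - S                                # S and the Laplacian L
```

With k = 5 and t = 1 this corresponds to the graph settings used in the experiments of Section 4.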

On the other hand, we expect that the projections {a^T x_i} generated by a projection vector a \in R^N, which span the final embedding space, approach the sufficiently smooth embeddings {y_i} as closely as possible. This expectation can be met by a least squares formulation that incorporates the local structure information via a regularization term defined as in Eq. (2). Mathematically, a projection vector a can be acquired by solving the following optimization problem [27]:

\min_{a, y} J(a, y) = \|X^T a - y\|^2 + \lambda_1 y^T L y,   (3)

where \lambda_1 \geq 0 is a trade-off parameter. Here we assume that the data matrix X has been centered. Taking the derivative of J(a, y) with respect to y and setting it equal to zero, we get

\partial J(a, y) / \partial y = 2(y - X^T a + \lambda_1 L y) = 0.   (4)

It follows that

y^* = (I + \lambda_1 L)^{-1} X^T a,   (5)

where I is the identity matrix of size n. Applying Eq. (5) to Eq. (3), we can eliminate y and obtain the following optimization problem with respect to a:

\min_a \tilde{J}(a) = J(a, y^*) = y^{*T} (\lambda_1 L)(I + \lambda_1 L) y^* = a^T X (I + \lambda_1 L)^{-1} (\lambda_1 L) X^T a = a^T X G X^T a,   (6)

where G is a positive semi-definite matrix of size n by n with the following expression:

G = (I + \lambda_1 L)^{-1} (\lambda_1 L).   (7)

We can observe from Eq. (6) that minimizing \tilde{J} with respect to a yields a smooth subspace in which both the global and the local structures of the data are taken into account.

3.2 Discriminant Analysis With Pairwise Constraints

Suppose we are given two sets of pairwise constraints: must-link (M) and cannot-link (C). In previous work [32,38,40,41], such constraints were used for learning an adaptive metric between the prototypes of instances. However, recent research has shown that a distance metric learned in a high-dimensional space is relatively unreliable [33,39]. Instead of using constraint-guided metric learning, in this paper our goal is to use pairwise constraints to find a projection vector such that, in the embedding space, the distance between any pair of instances involved in the cannot-link constraints is maximized while the distance between any pair of instances involved in the must-link constraints is minimized. Once the projection vector a is acquired, we can introduce the transformation

y_i = a^T x_i.   (8)

Under this transformation, we calculate the following sum of the squared distances of the point pairs in M:

d_w = \sum_{(x_i, x_j) \in M} (a^T x_i - a^T x_j)^2 = a^T S_w a,   (9)

where S_w is the covariance matrix of the point pairs in M:

S_w = \sum_{(x_i, x_j) \in M} (x_i - x_j)(x_i - x_j)^T.   (10)

Correspondingly, for the point pairs in C we have

d_b = a^T S_b a,   (11)

where S_b has the following expression:

S_b = \sum_{(x_i, x_j) \in C} (x_i - x_j)(x_i - x_j)^T.   (12)

Similarly to LDA, which seeks directions along which data points of different classes are far from each other while data points of the same class remain close to each other, we try to maximize d_b and minimize d_w. Thus, we can solve the following problem to obtain the projection direction:

\max_a  (a^T S_b a) / (a^T S_w a).   (13)
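Continuing the illustrative Python sketch (again our own reading of the formulas rather than the paper's code), the regularization matrix G of Eq. (7) and the constraint covariance matrices S_w and S_b of Eqs. (10) and (12) can be assembled directly from their definitions; the helper name structure_matrices and the list-of-index-pairs representation of M and C are assumptions made for the example.

```python
import numpy as np

def structure_matrices(X, must_link, cannot_link, L, lam1):
    """Compute G of Eq. (7) and the constraint covariances S_w (Eq. 10), S_b (Eq. 12).

    X is an (n, N) array of instances (rows); must_link and cannot_link are
    lists of index pairs (i, j); L is the graph Laplacian; lam1 is lambda_1.
    """
    n, N = X.shape
    # G = (I + lambda_1 L)^{-1} (lambda_1 L), Eq. (7).
    G = np.linalg.solve(np.eye(n) + lam1 * L, lam1 * L)

    def pair_scatter(pairs):
        # Sum of (x_i - x_j)(x_i - x_j)^T over the given pairs.
        S = np.zeros((N, N))
        for i, j in pairs:
            d = (X[i] - X[j])[:, None]
            S += d @ d.T
        return S

    S_w = pair_scatter(must_link)     # must-link covariance, Eq. (10)
    S_b = pair_scatter(cannot_link)   # cannot-link covariance, Eq. (12)
    return G, S_w, S_b
```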

3.3 The Optimization Framework

So far, we have pursued the projection direction a from two formulations: (1) a least squares formulation for finding a, which facilitates the integration of global and local structures; and (2) a discriminant formulation that maximizes the cannot-link covariance while simultaneously minimizing the must-link covariance. Treating the first formulation as a regularization term, we arrive at the following optimization problem:

\max_a  (a^T S_b a) / ((1 - \lambda_2) a^T S_w a + \lambda_2 \tilde{J}(a)),   (14)

where 0 \leq \lambda_2 \leq 1 controls the smoothness of the estimator. The optimal projection vector a that maximizes the objective function (14) is the eigenvector associated with the maximum eigenvalue of the following generalized eigenvalue problem:

S_b a = \eta ((1 - \lambda_2) S_w + \lambda_2 X G X^T) a.   (15)

In real-world applications, we usually need more than one projection vector to span a subspace. Let the matrix characterizing such a target subspace be A = [a_1, ..., a_d], formed by the eigenvectors associated with the d largest eigenvalues of Eq. (15). It is also worth noting that, since our method is linear, it has a direct out-of-sample extension to a novel sample x, i.e. A^T x.

Based on the analysis above, we propose a semi-supervised dimensionality reduction algorithm that integrates both global and local structures as defined in Eq. (14). The corresponding algorithm, called SGLS, is presented in Algorithm 1.

Algorithm 1
Input: X, M, C.
Output: A projection matrix A \in R^{N \times d}.
1. Construct an unsupervised k-nearest neighbor graph on all n instances to compute S as in Eq. (1) and L = D - S.
2. Use the least squares formulation together with the local structure information to compute the matrix G as in Eq. (7).
3. Compute S_w and S_b as in Eqs. (10) and (12).
4. Solve the generalized eigenvalue problem of Eq. (15) and let the resulting eigenvectors a_1, ..., a_d be ordered by decreasing eigenvalue.
5. Output A = [a_1, ..., a_d].

To get a stable solution of the eigen-problem in Eq. (15), the matrix on the right-hand side is required to be non-singular, which does not hold when the number of features is larger than the number of instances, i.e. N > n. In such a case, we can apply the Tikhonov regularization technique [24,26] to improve the estimation. Thus, the generalized eigen-problem in this paper can be rewritten as

S_b a = \eta ((1 - \lambda_2) S_w + \lambda_2 X G X^T + \gamma I_N) a,   (16)

where I_N is the identity matrix of size N and \gamma \geq 0 is a regularization parameter.
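Steps 4 and 5 of Algorithm 1 reduce to a generalized symmetric eigenproblem, for which standard solvers can be used. The sketch below is our illustration of Eq. (16) with SciPy's eigh; it assumes the helper routines sketched earlier, and the default values of lambda_2 and gamma are placeholders, not settings reported in the paper.

```python
import numpy as np
from scipy.linalg import eigh

def sgls_projection(X, S_b, S_w, G, d, lam2=0.5, gamma=1e-3):
    """Solve the regularized generalized eigenproblem of Eq. (16) and return
    the projection matrix A = [a_1, ..., a_d] (steps 4-5 of Algorithm 1).

    X is an (n, N) matrix of centered instances (rows), so the paper's
    X G X^T becomes X.T @ G @ X here.  lam2 and gamma are placeholder values.
    """
    N = X.shape[1]
    # Right-hand-side matrix of Eq. (16).
    R = (1.0 - lam2) * S_w + lam2 * (X.T @ G @ X) + gamma * np.eye(N)
    # Generalized symmetric eigenproblem  S_b a = eta * R a.
    eigvals, eigvecs = eigh(S_b, R)
    order = np.argsort(eigvals)[::-1]      # decreasing eigenvalues
    return eigvecs[:, order[:d]]           # (N, d) projection matrix A

# Sketch of the full pipeline (hypothetical variable names):
#   S, L = knn_heat_kernel_graph(X, k=5, t=1.0)
#   G, S_w, S_b = structure_matrices(X, M, C, L, lam1=...)
#   A = sgls_projection(X - X.mean(axis=0), S_b, S_w, G, d)
#   Y = X_new @ A        # out-of-sample extension: y = A^T x for each new x
```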

IV. Experiments

In this section, we study the performance of our SGLS algorithm in terms of clustering accuracy on several real-world data sets from different application domains. We set k = 5 and t = 1 for the construction of the similarity matrix S defined in Eq. (1).

4.1 Experimental Setup

In order to verify the efficacy of the proposed algorithm, we compare it with state-of-the-art semi-supervised dimensionality reduction algorithms, including SSDR [22], SCREEN [33], LDAKM [31] and LMDM [16]. We implemented SGLS in the MATLAB environment. All experiments were conducted on a Pentium Dual 3 GHz PC with 2.0 GB RAM. We compared all algorithms on seven benchmark data sets, including Segment and Digits from the UCI Machine Learning Repository, three document data sets (DOC1, DOC2 and DOC3) drawn from the 20-Newsgroups data, and two image data sets: the USPS handwritten digit data and Yale Face B (YALEB). The statistics of all data sets are summarized in Table 1.

Table 1: Summary of the test data sets used in our experiments

Data set    Instances    Dimensions    Classes
Segment     -            -             -
Digits      -            -             -
DOC1        -            -             -
DOC2        -            -             -
DOC3        -            -             -
USPS        -            -             -
YALEB       -            -             -

As a unified clustering platform, we use the K-means algorithm as the clustering method for comparing the algorithms after dimensionality reduction. For each data set, we ran the different algorithms 20 times and based the comparison on the average performance. The parameters of SGLS are kept fixed across all experiments unless stated otherwise. Moreover, for a fair comparison, the dimensionality used by all algorithms is set to K - 1, where K is the number of clusters. To evaluate the clustering performance, we employ normalized mutual information (NMI) as the clustering validation measure, which is widely used to verify the performance of clustering algorithms [8,25,33]. NMI is defined as

NMI = I(X, Y) / \sqrt{H(X) H(Y)},   (17)

where X and Y denote the random variables of cluster memberships from the ground truth and from the output of the clustering algorithm, respectively; I(X, Y) is the mutual information between X and Y; and H(X) and H(Y) are the entropies of X and Y, respectively.
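For reference, Eq. (17) can be computed from a label contingency table as in the following self-contained sketch, which assumes the square-root normalization written above; scikit-learn's sklearn.metrics.normalized_mutual_info_score offers an off-the-shelf alternative with a configurable normalization.

```python
import numpy as np

def nmi(labels_true, labels_pred):
    """Normalized mutual information, Eq. (17): I(X, Y) / sqrt(H(X) H(Y))."""
    labels_true = np.asarray(labels_true)
    labels_pred = np.asarray(labels_pred)
    n = labels_true.size
    classes, clusters = np.unique(labels_true), np.unique(labels_pred)
    # Joint distribution of ground-truth classes and predicted clusters.
    p_xy = np.array([[np.sum((labels_true == c) & (labels_pred == k))
                      for k in clusters] for c in classes], dtype=float) / n
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    nz = p_xy > 0
    mi = np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz]))   # I(X, Y)
    h_x = -np.sum(p_x[p_x > 0] * np.log(p_x[p_x > 0]))           # H(X)
    h_y = -np.sum(p_y[p_y > 0] * np.log(p_y[p_y > 0]))           # H(Y)
    return mi / np.sqrt(h_x * h_y)
```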

4.2 Experimental Results

Figure 1: Clustering performance on six data sets with different numbers of constraints.

Table 2 presents the NMI results of the various algorithms on all seven data sets. Figure 1 shows the clustering performance of standard K-means applied to the data projected by the different dimensionality reduction algorithms with different numbers of pairwise constraints. As can be seen, our SGLS algorithm achieves the best performance on the Digits, DOC1, DOC3, USPS and YALEB data sets, and its performance is comparable to that of the best competing algorithm on the remaining data sets. Specifically, we can make the following main observations from Table 2 and Figure 1:

(1) The SGLS algorithm significantly outperforms not only LMDM and LDAKM on all seven data sets used in the experiment, but also SCREEN and SSDR on six of the data sets. This can be attributed to the fact that integrating both global and local structures in SGLS is beneficial.

(2) It is interesting to note that SGLS slightly underperforms SSDR on Segment and SCREEN on DOC2. This implies that in certain cases, such as Segment, the global and local structures may capture similar information, so that integrating both structures hardly helps the dimensionality reduction process.

(3) It is clear from the presented results that LDAKM performs relatively worse than the other algorithms. This can be explained by the fact that LDAKM does not employ any side information when pursuing the projection direction; it only makes use of the abundant unlabeled instances, and consequently shows markedly inferior performance.

(4) Since SCREEN and LMDM use only the pairwise constraints to learn an optimal subspace, their performances are worse than those of SSDR and SGLS, respectively. In fact, the large number of unlabeled data points plays an important role in semi-supervised dimensionality reduction [8,26,27].

(5) Since SSDR can apply both the pairwise constraints and the unlabeled data to learn a subspace, it provides relatively satisfying clustering performance compared with LDAKM, SCREEN and LMDM. Compared with our SGLS, however, SSDR still performs noticeably worse. SSDR can in fact be regarded as a constrained Principal Component Analysis (PCA), and thus fails to preserve the local neighborhood structures of the instances in the reduced low-dimensional space.

Table 2: NMI comparisons on seven data sets

Data set    LDAKM    SCREEN    SSDR    LMDM    SGLS
Segment     -        -         -       -       -
Digits      -        -         -       -       -
DOC1        -        -         -       -       -
DOC2        -        -         -       -       -
DOC3        -        -         -       -       -
USPS        -        -         -       -       -
YALEB       -        -         -       -       -

V. Conclusions

In this paper, we propose a novel semi-supervised dimensionality reduction algorithm called SGLS. For dimensionality reduction and clustering, SGLS integrates both global and local geometric structures, so that it can naturally exploit the abundant unlabeled data, as the graph Laplacian is defined on all the instances. Compared with state-of-the-art dimensionality reduction algorithms on a collection of benchmark data sets in terms of clustering accuracy, SGLS consistently achieves a clear performance gain. Extensive experiments exhibit the advantages of the resulting semi-supervised clustering method, SGLS+K-means.

Acknowledgments

The research reported in this paper has been partially supported by the Natural Science Foundation of Zhejiang Province under Grant No. Y, the Public Project of Zhejiang Province under Grant No. 2013C33087, and the Academic Climbing Project of Young Academic Leaders of Zhejiang Province under Grant No. pd.

References

[1] D. Zhou, O. Bousquet, T.N. Lal, J. Weston, and B. Scholkopf. Learning with local and global consistency. In: Advances in Neural Information Processing Systems 16, 2003.
[2] P. Garcia-Laencina, G. Rodriguez-Bermudez, J. Roca-Dorda. Exploring dimensionality reduction of EEG features in motor imagery task classification. Expert Systems with Applications, 41(11), 2014.
[3] Y. Lin, T. Liu, C. Fuh. Dimensionality reduction for data in multiple feature representations. In: Advances in Neural Information Processing Systems 21, 2008.
[4] X. Zhu, Z. Ghahramani, and J. Lafferty. Semi-supervised learning using Gaussian fields and harmonic functions. In: Proceedings of the Twentieth International Conference on Machine Learning, 2003.
[5] X. He, S. Yan, Y. Hu, P. Niyogi, H. Zhang. Face recognition using Laplacianfaces. IEEE Trans. Pattern Anal. Mach. Intell. 27(3), 2005.
[6] S. Roweis, L. Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, 290(5500), 2000.
[7] R.O. Duda, P.E. Hart, and D.G. Stork. Pattern Classification. Wiley-Interscience, Hoboken, NJ, 2nd edition.
[8] X. Yin, T. Shu, Q. Huang. Semi-supervised fuzzy clustering with metric learning and entropy regularization. Knowledge-Based Systems, 35(11), 2012.
[9] M. Belkin and P. Niyogi. Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation, 15(6), 2003.
[10] J. Tenenbaum, V. de Silva, and J. Langford. A global geometric framework for nonlinear dimensionality reduction. Science, 290(5500), 2000.
[11] F. Tsai. Dimensionality reduction techniques for blog visualization. Expert Systems with Applications, 38(3), 2011.
[12] J. Yang, A.F. Frangi, J.Y. Yang, D. Zhang, Z. Jin. KPCA plus LDA: a complete kernel Fisher discriminant framework for feature extraction and recognition. IEEE Trans. Pattern Anal. Mach. Intell. 27(2), 2005.
[13] M. Belkin, P. Niyogi. Laplacian eigenmaps and spectral techniques for embedding and clustering. In: Advances in Neural Information Processing Systems 14, 2001.
[14] D. Cai, X. He, J. Han. Document clustering using locality preserving indexing. IEEE Trans. Knowl. Data Eng. 17(12), 2005.
[15] J. He, L. Ding, L. Jiang, Z. Li, Q. Hu. Intrinsic dimensionality estimation based on manifold assumption. Journal of Visual Communication and Image Representation, 25(5), 2014.
[16] S. Xiang, F. Nie, and C. Zhang. Learning a Mahalanobis distance metric for data clustering and classification. Pattern Recognition, 41(12), 2008.
[17] D. Cai, X. He, J. Han, H.-J. Zhang. Orthogonal Laplacianfaces for face recognition. IEEE Trans. Image Process. 15(11), 2006.
[18] Z. Zhang, M. Zhao, T. Chow. Marginal semi-supervised sub-manifold projections with informative constraints for dimensionality reduction and recognition. Neural Networks, 36, 2012.
[19] J. Chen, J. Ye, and Q. Li. Integrating global and local structures: a least squares framework for dimensionality reduction. In: IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2007.
[20] D. Cai, X. He, K. Zhou, J. Han, H. Bao. Locality sensitive discriminant analysis. In: International Joint Conference on Artificial Intelligence, 2007.
[21] W. Du, K. Inoue, K. Urahama. Dimensionality reduction for semi-supervised face recognition. In: Proceedings of Fuzzy Systems and Knowledge Discovery, 2005.
[22] D. Zhang, Z.-H. Zhou, S. Chen. Semi-supervised dimensionality reduction. In: SIAM Conference on Data Mining (SDM), 2007.
[23] X. Yang, H. Fu, H. Zha, J. Barlow. Semi-supervised nonlinear dimensionality reduction. In: Proceedings of the International Conference on Machine Learning, 2006.
[24] M. Belkin, P. Niyogi, V. Sindhwani. Manifold regularization: a geometric framework for learning from labeled and unlabeled examples. Journal of Machine Learning Research, 2006.
[25] J. Ye, Z. Zhao, H. Liu. Adaptive distance metric learning for clustering. In: Proceedings of the Conference on Computer Vision and Pattern Recognition, 2007.
[26] D. Cai, X. He, J. Han. Semi-supervised discriminant analysis. In: Proceedings of the International Conference on Computer Vision, 2007.
[27] W. Liu, D. Tao, and J. Liu. Transductive component analysis. In: Proceedings of the International Conference on Data Mining, 2008.
[28] P. Han, A. Jin, A. Kar. Kernel discriminant embedding in face recognition. Journal of Visual Communication and Image Representation, 22(7), 2011.
[29] J. Ye. Least squares linear discriminant analysis. In: Proceedings of the International Conference on Machine Learning, 2007.
[30] X. Chen, S. Chen, H. Xue, X. Zhou. A unified dimensionality reduction framework for semi-paired and semi-supervised multi-view data. Pattern Recognition, 45(5), 2012.
[31] C. Ding, T. Li. Adaptive dimension reduction using discriminant analysis and K-means clustering. In: Proceedings of the International Conference on Machine Learning, 2007.
[32] E. Xing, A. Ng, M. Jordan, S. Russell. Distance metric learning with application to clustering with side-information. In: Advances in Neural Information Processing Systems 15, 2003.
[33] W. Tang, H. Xiong, S. Zhong, J. Wu. Enhancing semi-supervised clustering: a feature projection perspective. In: Proceedings of the International Conference on Knowledge Discovery and Data Mining, 2007.
[34] X. He, P. Niyogi. Locality preserving projections. In: Advances in Neural Information Processing Systems 16, 2003.
[35] H. Chen, H. Chang, and T. Liu. Local discriminant embedding and its variants. In: Proceedings of the International Conference on Computer Vision and Pattern Recognition, 2005.
[36] J. Zhao, K. Lu, X. He. Locality sensitive semi-supervised feature selection. Neurocomputing, 71(10-12), 2008.

[37] S. Yan, D. Xu, B. Zhang, H.-J. Zhang, Q. Yang, and S. Lin. Graph embedding and extensions: a general framework for dimensionality reduction. IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(1), 2007.
[38] B. Kulis, S. Basu, I. Dhillon, and R. Mooney. Semi-supervised graph clustering: a kernel approach. In: Proceedings of the International Conference on Machine Learning, 2005.
[39] D. Meng, Y. Leung, Z. Xu. Passage method for nonlinear dimensionality reduction of data on multi-cluster manifolds. Pattern Recognition, 46(8), 2013.
[40] M. Bilenko, S. Basu, R.J. Mooney. Integrating constraints and metric learning in semi-supervised clustering. In: Proceedings of the International Conference on Machine Learning, 2004.
[41] D.-Y. Yeung, H. Chang. Extending the relevant component analysis algorithm for metric learning using both positive and negative equivalence constraints. Pattern Recognition, 39(5), 2006.
