A new Unsupervised Clustering-based Feature Extraction Method


Sabra El Ferchichi, ACS, National School of Engineering at Tunis, Tunisia
Salah Zidi, LAGIS, Lille University of Science and Technology, France
Kaouther Laabidi, ACS, National School of Engineering at Tunis, Tunisia

ABSTRACT
In manipulating data, as in supervised or unsupervised learning, we often need to extract new features from the original ones in order to reduce the dimension of the feature space and achieve better performance. In this paper, we investigate a novel scheme for unsupervised feature extraction for classification problems. Our method relies on clustering to achieve feature extraction: a new similarity measure based on trend analysis is devised to identify redundant information in the data, and clustering is then performed on the feature space. Once groups of similar features are formed, a linear transformation is applied to extract a new set of features. Simulation results on classification problems, using experimental data sets from the UCI machine learning repository and a face recognition problem, show that the proposed method is effective in almost all cases when compared to conventional unsupervised methods like PCA and ICA.

General Terms
Pattern Recognition, Machine Learning, Feature Extraction, and Data Mining.

Keywords
Unsupervised feature extraction, similarity measure, clustering, face recognition, and classification problems.

1. INTRODUCTION
Feature extraction is an important step in data analysis. It can be used as a preprocessor for applications including visualization, classification, detection, and verification. Here, we are interested in applying feature extraction to classification problems, and in particular to the face recognition problem. As the amount of collected data becomes larger, extracting the right features becomes necessary for an efficient classification process. However, when designing a classification system, prior knowledge about which features are effective for classification is usually unavailable.
Although classification performance is a non-decreasing function of the number of features, using a large number of features can be wasteful of both computational and memory resources [1]. In addition, training a classifier with a large number of features describing a finite amount of data can eventually degrade the classifier's accuracy [2]. Two different approaches exist to address this issue: feature selection, which consists of selecting only relevant attributes through a pre-defined criterion for feature relevance [3]; and feature extraction, which applies a linear or nonlinear transformation to the original set of features to construct new ones [4]. In fact, different feature selection strategies lead either to a combinatorial problem or to a suboptimal solution [4]. For this very reason, finding a transformation to lower dimensions might be easier than selecting features, given an appropriate objective function. According to whether class labels are used, feature extraction methods can be divided into supervised and unsupervised ones. When labeled data are sufficient, supervised feature extraction methods usually outperform unsupervised ones [5]. However, in many cases obtaining class labels is expensive and the amount of labeled training data is very limited, so supervised feature extraction methods may fail on such small labeled-sample problems. In this work, we are instead interested in unsupervised feature extraction, where no prior knowledge about the probability density of the data or its class distribution is available. It has been reported through experiments that classifier performance can deteriorate as new irrelevant features are added [7]. This problem can be avoided by extracting new features containing maximal information about the data. One well-known unsupervised feature extraction method is Principal Component Analysis (PCA) [6]. It is derived from the eigenvectors corresponding to the largest eigenvalues of the covariance matrix of the data.
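PCA as just described, projection onto the eigenvectors of the data covariance matrix, can be sketched in a few lines. This illustration is not part of the original paper, and the function name is ours:

```python
import numpy as np

def pca_extract(X, d):
    """Project an N x D data matrix X onto its d leading principal components.

    A minimal sketch of standard PCA, not the paper's FECM method.
    """
    Xc = X - X.mean(axis=0)                 # center the data
    cov = np.cov(Xc, rowvar=False)          # D x D covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1][:d]   # indices of the d largest
    return Xc @ eigvecs[:, order]           # N x d extracted features

# Example: reduce 5-dimensional data to 2 features
X = np.random.default_rng(0).normal(size=(100, 5))
Y = pca_extract(X, 2)
assert Y.shape == (100, 2)
```

As the paper notes, such features are not invariant under rescaling of the original attributes.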
PCA seeks to optimally represent the data in terms of minimal mean square error between the representation and the original data. The main drawback of this method is that the extracted features are not invariant under transformation: merely scaling the attributes changes the resulting features [7]. However, it may be useful in reducing noise in the data by separating the signal and noise subspaces. Another unsupervised method, Independent Component Analysis (ICA), has been proposed as a tool to find interesting projections of the data [7]. It was devised for blind source separation problems and has demonstrated a lot of potential in various applications. It produces statistically independent components by maximizing negentropy (the divergence from a Gaussian density function). Hence, feature extraction methods like PCA and ICA, as well as feature selection methods, try either to find new directions that are statistically independent, or to eliminate entirely the features judged irrelevant or redundant under a specific criterion. An alternative approach is to reduce feature dimensionality by grouping similar features into a much smaller number of feature-clusters and using these clusters as features. In this way, the information contained in redundant features can be preserved while the size of the model is reduced and good performance is maintained. The crucial stage in such procedures is determining the similarity of features. This technique has been used by other authors, especially for text classification problems [8, 9] and protein sequence analysis [10]. The former works investigate clustering techniques based on an Information Bottleneck similarity measure for text categorization, while the latter uses a biologically motivated similarity measure, based on the contact energy of the amino acids and their position in the sequence, for the prediction of HIV protease resistance to drugs. Hence, this work aims to develop a new method that investigates the use of a new similarity-based clustering technique to perform unsupervised feature extraction for pattern classification problems, and face recognition in particular. The new similarity measure that we propose has more general capabilities than the ones discussed above. In high dimensional data there are many features that have similar tendencies along the instances of the data set: they describe similar variations of monotonicity (increasing or decreasing). These features may give related, or even the same, discriminative information to the learning process, which makes them redundant and useless. Analyzing the variations of monotonicity of each feature along the data set can therefore reveal a form of redundancy between features. By using trend analysis, each feature is completely described by its signature, which is statistically distinguished from random behavior. Intuitively, once groups of similar features have been settled, feature extraction can be realized: a linear transformation is applied to each identified cluster of similar features. The rest of this paper is organized as follows. In Section 2, we introduce the new unsupervised method for extracting features based on analyzing their tendencies along the data set; a clustering technique based on a new measure of similarity between features is used to identify and gather similar features. In Section 3, the performance of the proposed Feature Extraction Method based on Clustering (FECM) is assessed on classification problems and a face recognition task, and compared to the conventional linear unsupervised methods PCA and ICA. In Section 4, we offer some conclusions and suggestions for future work.

2. FEATURE EXTRACTION METHOD BASED ON CLUSTERING
In this section, we introduce our new unsupervised feature extraction method based on clustering (FECM) for pattern classification problems.
In fact, redundant information is an intrinsic characteristic of high dimensional data, which may complicate the learning task and degrade classifier performance [7]. Hence, identifying redundant features can reduce the dimension without losing important information. Existing feature selection solutions use a filter method to select relevant features and eliminate redundant features entirely. Although redundant information is not necessarily relevant for establishing discriminating classification rules, it has an underlying interaction with the rest of the features. Eliminating it entirely from the feature space may discard some predictive information about the inherent structure of the data and thereby induce a less accurate classifier. Our goal is then to identify and transform this form of redundancy in a way that conserves the inherent structure of the data and minimizes the amount of predictive information lost. Intuitively, similar features describe very similar or mostly the same variations of monotonicity along the data set; they might incorporate the same discriminating information. Identifying this form of similarity through an appropriate metric, grouping such features into different clusters, and applying a linear transform to each group are the steps of our feature extraction approach. Hence, we base our method on a clustering algorithm built on a new similarity measure, in order to identify linear or complex relations that may exist between features. Clustering is supposed to discover the inherent structure of data [11-12]: it creates groups of objects, or clusters, in such a way that similar objects are grouped together while different ones are segregated into distinct clusters, so as to approximate the true partition under the objective function Q. The new similarity measure used in feature clustering is devised from trend analysis of each feature vector along the data set.

2.1 Problem Formulation
A feature extraction process transforms, linearly or nonlinearly, a set of original features into new ones.
The new features are supposed to describe the instances of a database completely and to be useful for either supervised or unsupervised classification tasks. Considering a set of N D-dimensional samples {x_1, x_2, ..., x_N}, a feature extraction process tries to find a transformation T to apply, such that:

y = T(x), dim(y) = d, 1 <= d <= D.  (1)

Each transformed sample y is composed of d <= D new features. In the original space, each feature vector v_j (j = 1, ..., D) is constituted by the values that feature j takes on each instance x_i. A new metric is devised to characterize the similarity between features, and a clustering algorithm is applied on the feature space {v_1, v_2, ..., v_D} to gather similar features. The formulation of the new similarity measure, inspired by trend analysis, is explained in the next section. The objective function Q to be optimized by the clustering procedure is defined by:

Q = sum_{k=1}^{d} sum_{v_i, v_j in C_k} d_ij,  (2)

where the distance d_ij between two feature vectors v_i and v_j has to be appropriately defined to reach an optimal result. The clustering process thus finds similar features and forms d clusters, d <= D, each represented by its centroid:

g_k = f(S_k),  (3)

where S_k is the set of n_k feature vectors v_s belonging to cluster C_k and the transformation f is defined by:

f(S_k) = sum_{s=1}^{n_k} w_s v_s.  (4)

Finally, the obtained set of centroids {g_1, ..., g_d} is the set of d new features that re-describes the data set {x_1, x_2, ..., x_N}.

2.2 Similarity Measure
Distance or similarity relationships between pairs of patterns are the most important information in a clustering process for approximating the true partition of a data set. FECM focuses on defining a similarity measure that characterizes similarity in the behavior of each pair of features. We propose to analyze their tendencies by studying and comparing their variations of monotonicity along the data set, rather than the differences between their actual values. Thus, a conventional distance like the Euclidean distance normally used in clustering algorithms is not suitable for our objective: it computes the mean difference between the values of two vectors without using any tendency information about them, and two features may have the same (or close) means yet differ completely in their trend. To describe the monotonicity of each feature vector, we use trend analysis. A trend is semi-quantitative information describing the evolution of the qualitative state of a variable over a time interval, using a set of symbols such as {Increasing, Decreasing, Steady} [13]. In our case, the trend describes the evolution of each feature over a finite set of samples. We proceed by computing the first order derivative of a feature vector v at each point or sample x_i:

dv/dx = (v_{i+1} - v_i) / (x_{i+1} - x_i).  (5)

Then, we determine the sign of the derivative at each point:

sign(dv/dx) = +1 if dv/dx > 0 (increase), -1 if dv/dx < 0 (decrease), 0 if dv/dx = 0 (steady).  (6)

A feature vector v is then represented as a vector of new variables taking their values in {+1, -1, 0}. The distance function devised to compare two feature vectors relies on the difference in their signs of tendency: it is the squared sum of the absolute differences between the occurrences of each trend value in the two feature vectors, inspired by the Value Difference Metric (VDM) [14]. Thus, the location of a feature vector within the feature space is defined not directly by the values of its components, but by the conditional distributions of the trend extracted in each component; this makes the distance between two features independent of the order of the data. The distance, or similarity measure, is given by:

d(v_i, v_j) = sum_{a in {+1, -1, 0}} delta_a(v_i, v_j)^2,  (7)

delta_a(v_i, v_j) = | p(a | v_i) - p(a | v_j) |,  (8)

p(a | v_i) = (occurrence of a in v_i) / |v_i|.  (9)

p(a | v_i) is determined by counting how many times the trend value a occurs in the representation of feature vector v_i over the learning data set.
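As an illustrative reading of equations (5)-(9), not code from the paper, the trend signature and the occurrence-based distance can be sketched as follows. We assume uniformly spaced samples, so only the sign of successive differences matters, and we use the single-value occurrence of (9); the paper itself refines this by counting pairs of consecutive trend values:

```python
import numpy as np

def trend_signature(v):
    """Sign of the first-order derivative at each point, as in (5)-(6).

    Assumes uniformly spaced samples, so only differences matter.
    Returns values in {+1, -1, 0} for increase, decrease, steady.
    """
    return np.sign(np.diff(np.asarray(v, dtype=float)))

def trend_distance(v1, v2):
    """Squared sum of absolute differences of trend-value frequencies, as in (7)-(9)."""
    s1, s2 = trend_signature(v1), trend_signature(v2)
    total = 0.0
    for a in (-1.0, 0.0, 1.0):
        p1 = np.mean(s1 == a)   # p(a | v1): relative occurrence of trend value a
        p2 = np.mean(s2 == a)
        total += abs(p1 - p2) ** 2
    return total

# Two features with the same mean but opposite trends are far apart,
# while two features with identical trends are at distance zero:
up = [1, 2, 3, 4, 5]
down = [5, 4, 3, 2, 1]
assert trend_distance(up, down) == 2.0
assert trend_distance(up, [2, 3, 4, 5, 6]) == 0.0
```

Because the distance only compares occurrence frequencies, shuffling the samples in the same way for both features leaves it unchanged, matching the order-independence claim above.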
In fact, in this work we count the occurrences of pairs of consecutive trend values, (1,0), (1,1), (1,-1), (-1,0), (-1,1), (-1,-1), (0,0), (0,1), (0,-1), in each vector v_i, instead of computing only the probabilities of the single values {+1, -1, 0}. A similarity matrix D between all feature vectors is then generated such that D(i, j) = d(v_i, v_j), i, j = 1, ..., n.

2.3 Feature Extraction Scheme
The feature extraction algorithm that we propose in this work is given in Figure 1. A clustering algorithm based on the similarity measure described above is used to form clusters of similar features. The first step computes the similarity matrix M over the feature set, based on the metric defined previously by (7). The second step performs a clustering strategy based on C-means clustering. We fix the number of clusters, which corresponds indirectly to the number of newly extracted features, such that d = ndv, where the threshold is chosen empirically. Initially, clusters are initialized randomly. Each cluster C_k is then formed sequentially by selecting the first ranked features in the similarity matrix M; they correspond to the set of features most similar to the corresponding centroid g_k. The cluster center is then updated by computing the new centroid:

g_k = W V_k, k = 1, ..., d,  (10)

where W is the transform matrix applied to the set of n_k features of cluster C_k and V_k is the matrix containing the feature vectors in C_k. Since this process allows an overlap between clusters (features can be assigned to more than one cluster), we identify the features common to each pair of obtained clusters. Each common feature is then assigned to the closest cluster according to the Euclidean distance:

for v in C_1 ∩ C_2: assign v to C_h, h = arg min_{h in {1, 2}} || g_h - v ||.  (11)

Cluster centers are then re-computed using the current cluster memberships, and the process stops when all d clusters are constructed.
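The clustering stage described above can be sketched as a C-means-style loop. This is a simplified illustration under our own assumptions (equal-weight centroids as a special case of (10), a pluggable feature distance, and no cluster-overlap handling), not the paper's exact implementation:

```python
import numpy as np

def cluster_features(V, d, dist, n_iter=10, seed=0):
    """Greedy C-means-style grouping of the feature vectors in V (D x N).

    `dist` is a pairwise feature distance (e.g. a trend-based measure);
    each cluster centroid is the equal-weight mean of its member features.
    """
    rng = np.random.default_rng(seed)
    centroids = V[rng.choice(len(V), size=d, replace=False)]
    for _ in range(n_iter):
        # assign every feature vector to its closest centroid
        labels = np.array([np.argmin([dist(v, g) for g in centroids]) for v in V])
        # update each centroid as the mean of its assigned features
        for k in range(d):
            if np.any(labels == k):
                centroids[k] = V[labels == k].mean(axis=0)
    return centroids, labels

# Toy example: 6 feature vectors grouped into 2 clusters, here with a
# plain Euclidean distance standing in for the trend-based measure
V = np.array([[0., 0.], [0.1, 0.], [0., 0.1],
              [5., 5.], [5.1, 5.], [5., 5.1]])
euclid = lambda a, b: float(np.linalg.norm(a - b))
centroids, labels = cluster_features(V, 2, euclid)
assert len(set(labels[:3])) == 1 and len(set(labels[3:])) == 1
```

The returned centroids play the role of the new features g_k; in the paper, overlapping features would additionally be reassigned via (11) before the final centroids are computed.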
The quality of the extracted features is quantified by evaluating the classification accuracy on the transformed data set, using a K-fold cross-validation scheme for the pattern classification tasks and a leave-one-out scheme for the face recognition task.
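The K-fold evaluation protocol can be sketched as follows; the function names are illustrative, and the classifier is passed in as fit/predict callbacks rather than fixed to the SVM used in the paper:

```python
import numpy as np

def k_fold_indices(n, k, seed=0):
    """Split n instance indices into k roughly equal folds, as in K-fold CV."""
    idx = np.random.default_rng(seed).permutation(n)
    return np.array_split(idx, k)

def cross_validate(X, y, k, fit, predict):
    """Average accuracy over k folds: train on k-1 folds, test on the held-out one."""
    folds = k_fold_indices(len(X), k)
    accs = []
    for i, test in enumerate(folds):
        train = np.concatenate([f for j, f in enumerate(folds) if j != i])
        model = fit(X[train], y[train])
        accs.append(np.mean(predict(model, X[test]) == y[test]))
    return float(np.mean(accs))

# A 13-fold split of the 208 Sonar instances gives 13 folds of 16 each
folds = k_fold_indices(208, 13)
assert len(folds) == 13 and all(len(f) == 16 for f in folds)
```

Leave-one-out, used below for face recognition, is the special case k = n with one instance per fold.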

{Input: raw data}
{Output: cluster centers}
Compute the sign of the first order derivative, sign(dv/dx)
Compute the proportions p(a | v)
Compute the distance matrix D(i, j) = d(v_i, v_j), i, j = 1, ..., n
Clustering:
{ d = ndv: number of clusters
  m: number of preselected features
  n: initial number of features
  Idx: index of the initial centroid }
While n > 1
    Dist = sort M
    Cluster C = select the first m features from Dist
    n = n - m
    Update Idx
End
Intersection between the d final clusters
Compute the cluster centers g
New features = {cluster centers}

Fig 1: Pseudo-code for the FECM algorithm

3. EXPERIMENTAL RESULTS
First, we ran the proposed extraction method (FECM) on some well-known and widely used data sets from the UCI machine learning repository [15]. Second, we carried out experiments on the face recognition problem using the Yale and ORL databases.

3.1 Results on UCI Databases
We used 4 data sets: Sonar, Pima, Breast Cancer and Ionosphere. Some characteristics of these data sets are shown in Table 1. We ran the conventional unsupervised feature extraction methods PCA and ICA on these data sets and compared their classification performance with that of our proposed method FECM for different numbers of extracted features. We used Support Vector Machines (SVM) [16] as the classifier. No pre-treatment was applied to these data sets before feature extraction.

Table 1. Data sets information
Data sets | No. of features | No. of instances | No. of classes | Initial classification accuracy
Sonar | 60 | 208 | 2 | 82%
Pima | 8 | 768 | 2 | —
Breast Cancer | 9 | 699 | 2 | —
Ionosphere | 34 | 351 | 2 | —

A 13-fold cross-validation scheme was used for the Sonar data set, and 10-fold cross-validation for the others, as in ref [7].

3.1.1 Sonar data set
This data set was constructed to discriminate between sonar returns bounced off a metal cylinder and those bounced off a rock. It consists of 208 instances, with 60 features and two output classes: mine and rock. In our experiments, we used 13-fold cross-validation to obtain the performances as follows: the 208 instances are randomly divided into 13 sets with 16 instances each.
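The experiments use an SVM with a Gaussian kernel as the classifier; for reference, the kernel itself (sketched here in Python rather than the Matlab toolbox the paper used) is:

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    """RBF kernel k(x, z) = exp(-||x - z||^2 / (2 * sigma^2))."""
    x, z = np.asarray(x, dtype=float), np.asarray(z, dtype=float)
    return float(np.exp(-np.sum((x - z) ** 2) / (2.0 * sigma ** 2)))

# Identical points have kernel value 1; distant points approach 0
assert gaussian_kernel([0, 0], [0, 0]) == 1.0
assert gaussian_kernel([0, 0], [10, 10], sigma=1.0) < 1e-20
```

The width sigma is the parameter tuned per data set in the experiments below.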
For the SVM, we used a Matlab toolbox implementation. The kernel function is a Gaussian kernel, and its parameter sigma was set, after various tests, to 10, 1 or 0.1 depending on the data set. For each experiment, 12 of these sets are used as training data, while the 13th is reserved for validation. The experiment is repeated 13 times so that every case appears once as part of a test set. The SVM parameter sigma was set to 1. Figure 2 shows the classification accuracy for different numbers of extracted features.

Fig 2: Classification accuracy on the Sonar data set

The performance of FECM is far better than that of PCA and ICA, except in the very low dimensional case of a single extracted feature, where ICA outperforms it. Since the concept of our approach is to form groups of similar features, extracting a very low number of features means gathering all features into a few clusters, which can be delicate for some data sets, as it is here. In this case FECM is not the most effective method; nevertheless, it still achieves better classification accuracy than PCA, and better than ICA at higher dimensions. Note also that at dimensions 9 and 12, FECM exceeds the initial classification accuracy of 82%, which is far better than ICA and PCA.

3.1.2 Pima Indian Diabetes data set
This data set consists of 768 instances, of which 500 are class 0 and the rest class 1. It has eight numeric features with no missing values. It contains numerous invalid data points, e.g., features that have a value of 0 even though a value of 0 is not meaningful or physically possible. These correspond to points in feature space that are statistically distant from the mean calculated using the remaining data (with the points in question removed). We removed the points in question from the data set. We then applied PCA, ICA and FECM and compared their performance as we did for the Sonar data set. In training, we used an SVM with parameter sigma set to 10, and 10-fold cross-validation for the validation process. Classification performance is presented in Figure 3.

Fig 3: Classification accuracy on the Pima data set

The performances of PCA and ICA get closer as the number of extracted features becomes larger. On this data set, ICA outperforms both PCA and FECM for different numbers of features, especially the lower ones such as dimensions 1 and 2. The proposed method FECM still performs better than PCA and approaches the performance of ICA at higher dimensions such as 3 and 4.

3.1.3 Wisconsin Breast Cancer data set
This data set consists of nine attributes and two classes, benign and malignant. It contains 699 instances, 458 benign and 241 malignant. There are 16 missing values, which we replaced with the average values of the corresponding attributes, as in ref [7]. We compared the classification performance of FECM with that of ICA and PCA for different numbers of extracted features; results are shown in Figure 4. We used 10-fold cross-validation as the verification strategy. For the SVM classifier, the parameter was set to 0.01 after various experiments. With only one extracted feature, FECM reaches the maximum classification performance; for larger numbers of extracted features, PCA outperforms both ICA and FECM.

Fig 4: Classification accuracy on the Breast Cancer data set

3.1.4 Ionosphere data set
This radar data was collected by a system in Goose Bay, Labrador, consisting of a phased array of 16 high-frequency antennas. Received signals were processed using an autocorrelation function whose arguments are the time of a pulse and the pulse number. Instances in this database are described by 2 attributes per pulse number, corresponding to the complex values returned by the function resulting from the complex electromagnetic signal. We compared the classification performance of FECM with that of ICA and PCA for different numbers of extracted features; results are shown in Figure 5. We used 10-fold cross-validation as the verification strategy, and for the SVM classifier the parameter sigma was set to 10 after various experiments. With only one extracted feature, FECM largely outperforms ICA and PCA; for larger numbers of extracted features, FECM achieves similar or better performance, reaching the best accuracy with 12 features.

Fig 5: Classification accuracy on the Ionosphere data set

3.2 Results on Face Recognition Problem
We used two data sets: the Yale and ORL face databases. Figure 6 shows some samples from the Yale and ORL databases respectively.

Fig 6: Samples from the Yale and ORL databases

3.2.1 Yale database
It contains 165 GIF images of 15 subjects (subject01, subject02, etc.). There are 11 images per subject, one for each of the following facial expressions or configurations: center-light, with glasses, happy, left-light, without glasses, normal, right-light, sad, sleepy, surprised, and wink. The size of each image is 320 x 243 pixels. There are many methods to determine the features of an image.

The most intuitive way is to use each pixel as one feature, but the input space then becomes too large to be handled. To overcome this constraint we had to accept the loss of some information: each image was resized to 784 pixels. The classification performance of the features extracted by PCA, ICA and FECM was obtained with the leave-one-out scheme. The number of features extracted by ICA is the same as for PCA (PCA is used as a pre-processor for ICA). In the experiments, PCA was initially conducted on the 784 pixels and the first 30 PCs were used; ICA was then applied to these 30 PCs. Table 2 shows the classification error rates corresponding to each of these methods; recognition was performed using the K nearest neighbor (KNN) classifier. PCA and ICA produce the same number of features, 30, but ICA is slightly better than PCA in classification accuracy. Our proposed approach outperforms PCA and ICA using only 14 features.

Table 2. Classification performance on the Yale data set
Methods | Error rate (%) (KNN) | No. of features
PCA | — | 30
ICA | — | 30
FECM | — | 14

3.2.2 ORL database
The AT&T (ORL) database of faces consists of 400 images: 10 different images for each of 40 distinct individuals. It includes various lighting conditions, facial expressions and facial details. The images were cropped to 952 pixels for computational efficiency. The experiments were performed exactly as for the Yale database; the results in Table 3 come from the leave-one-out test with the one nearest neighbor classifier. The first 40 PCs are retained as input for ICA. PCA and ICA produce similar results. Our approach is close to PCA and ICA in terms of classification accuracy, with a smaller number of features: 23.

Table 3. Classification performance on the ORL data set
Methods | Error rate (%) (KNN) | No. of features
PCA | — | 40
ICA | — | 40
FECM | — | 23

4. CONCLUSION
This work focused on developing a general feature extraction approach based on a clustering technique for pattern classification tasks, and face recognition in particular.
The main motivation behind it is to identify redundancy in the feature space and reduce its effect without losing information that is important for the classification process. Similar features are recognized by analyzing their tendencies along the data set. Although trend analysis is used to devise the new similarity measure, the measure remains independent of the order of the samples. In the first stage, the proposed approach FECM extracts feature-clusters as features by applying a clustering technique based on the new similarity measure; in the second stage, these features are used for pattern classification. Results obtained from experiments conducted on several data sets from the UCI machine learning repository showed that this representation of patterns improves over PCA and ICA in almost all cases, especially when projecting to low dimensions. Only on the Pima data set was FECM unable to improve over the other methods at lower dimensions, though it reaches similar results in higher dimensional projections. For the face recognition task, FECM reaches the lowest dimension with the best accuracy on the Yale data set; on the ORL data set, FECM reaches the lowest dimension with an accuracy lower by 1%. The transformation applied to the obtained clusters was a linear combination with equal weights on each component. However, this transformation is a very important step, since it determines a representative feature for each group: it has to preserve the main characteristics of each group of features and incorporate them into the new representative feature. An appropriate transformation, perhaps based on a sophisticated statistical measure of dependency such as mutual information, will be defined in further work. In addition, it would be interesting to incorporate label information into the FECM approach for semi-supervised or supervised learning tasks.

5. REFERENCES
[1] Hild II, K. E., Erdogmus, D., Torkkola, K. and Principe, J. C. Feature extraction using information-theoretic learning. IEEE Trans.
on Pattern Analysis and Machine Intelligence, 28 (September 2006).
[2] Ripley, B. D. Pattern Recognition and Neural Networks. Cambridge Univ. Press, 1996.
[3] Guyon, I. and Elisseeff, A. An introduction to variable and feature selection. J. Mach. Learn. Res., 3 (March 2003).
[4] Torkkola, K. Feature extraction by non-parametric mutual information maximization. J. Mach. Learn. Res., 3 (2003).
[5] Liu, X., Tang, J., Liu, J. and Feng, Z. A semi-supervised relief based feature extraction algorithm. In Second International Conference on Future Generation Communication and Networking Symposia.
[6] Saul, L. K., Weinberger, K. Q., Sha, F., Ham, J. and Lee, D. D. Spectral methods for dimensionality reduction. In O. Chapelle, B. Schoelkopf and A. Zien (eds.), Semi-Supervised Learning, MIT Press, Cambridge, MA.
[7] Kwak, N. and Choi, C.-H. Feature extraction based on ICA for binary classification problems. IEEE Trans. on Knowledge and Data Engineering, 15 (November 2003).
[8] Slonim, N. and Tishby, N. The power of word clusters for text classification. 23rd European Colloquium on Information Retrieval Research.
[9] Baker, L. D. and McCallum, A. K. Distributional clustering of words for text classification. 21st Annual International ACM SIGIR Conference on Research and Development in Information Retrieval.
[10] Bonet, I., Saeys, Y., Grau Abalo, R., García, M. M., Sanchez, R. and Van de Peer, Y. Feature extraction using clustering of protein. In Proceedings of the 11th Iberoamerican Congress on Pattern Recognition (CIARP).
[11] Fern, X. Z. and Brodley, C. E. Cluster ensembles for high dimensional clustering: an empirical study. Technical report.
[12] Von Luxburg, U., Bubeck, S., Jegelka, S. and Kaufmann, M. Consistent minimization of clustering objective functions. Neural Information Processing Systems (NIPS).

[13] Charbonnier, S. and Gentil, S. A trend-based alarm system to improve patient monitoring in intensive care units. Control Engineering Practice, 15, 2007.
[14] Payne, T. R. and Edwards, P. Implicit feature selection with the value difference metric. 13th European Conference on Artificial Intelligence.
[15] UCI Machine Learning Repository.
[16] Cortes, C. and Vapnik, V. Support-vector networks. Machine Learning, 20, 1995.


More information

Parallelism for Nested Loops with Non-uniform and Flow Dependences

Parallelism for Nested Loops with Non-uniform and Flow Dependences Parallelsm for Nested Loops wth Non-unform and Flow Dependences Sam-Jn Jeong Dept. of Informaton & Communcaton Engneerng, Cheonan Unversty, 5, Anseo-dong, Cheonan, Chungnam, 330-80, Korea. seong@cheonan.ac.kr

More information

Support Vector Machines

Support Vector Machines Support Vector Machnes Decson surface s a hyperplane (lne n 2D) n feature space (smlar to the Perceptron) Arguably, the most mportant recent dscovery n machne learnng In a nutshell: map the data to a predetermned

More information

Detection of an Object by using Principal Component Analysis

Detection of an Object by using Principal Component Analysis Detecton of an Object by usng Prncpal Component Analyss 1. G. Nagaven, 2. Dr. T. Sreenvasulu Reddy 1. M.Tech, Department of EEE, SVUCE, Trupath, Inda. 2. Assoc. Professor, Department of ECE, SVUCE, Trupath,

More information

Unsupervised Learning

Unsupervised Learning Pattern Recognton Lecture 8 Outlne Introducton Unsupervsed Learnng Parametrc VS Non-Parametrc Approach Mxture of Denstes Maxmum-Lkelhood Estmates Clusterng Prof. Danel Yeung School of Computer Scence and

More information

CS434a/541a: Pattern Recognition Prof. Olga Veksler. Lecture 15

CS434a/541a: Pattern Recognition Prof. Olga Veksler. Lecture 15 CS434a/541a: Pattern Recognton Prof. Olga Veksler Lecture 15 Today New Topc: Unsupervsed Learnng Supervsed vs. unsupervsed learnng Unsupervsed learnng Net Tme: parametrc unsupervsed learnng Today: nonparametrc

More information

A Fast Visual Tracking Algorithm Based on Circle Pixels Matching

A Fast Visual Tracking Algorithm Based on Circle Pixels Matching A Fast Vsual Trackng Algorthm Based on Crcle Pxels Matchng Zhqang Hou hou_zhq@sohu.com Chongzhao Han czhan@mal.xjtu.edu.cn Ln Zheng Abstract: A fast vsual trackng algorthm based on crcle pxels matchng

More information

Lecture 4: Principal components

Lecture 4: Principal components /3/6 Lecture 4: Prncpal components 3..6 Multvarate lnear regresson MLR s optmal for the estmaton data...but poor for handlng collnear data Covarance matrx s not nvertble (large condton number) Robustness

More information

Face Recognition Based on SVM and 2DPCA

Face Recognition Based on SVM and 2DPCA Vol. 4, o. 3, September, 2011 Face Recognton Based on SVM and 2DPCA Tha Hoang Le, Len Bu Faculty of Informaton Technology, HCMC Unversty of Scence Faculty of Informaton Scences and Engneerng, Unversty

More information

NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS

NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS ARPN Journal of Engneerng and Appled Scences 006-017 Asan Research Publshng Network (ARPN). All rghts reserved. NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS Igor Grgoryev, Svetlana

More information

Unsupervised Learning and Clustering

Unsupervised Learning and Clustering Unsupervsed Learnng and Clusterng Why consder unlabeled samples?. Collectng and labelng large set of samples s costly Gettng recorded speech s free, labelng s tme consumng 2. Classfer could be desgned

More information

Two-Dimensional Supervised Discriminant Projection Method For Feature Extraction

Two-Dimensional Supervised Discriminant Projection Method For Feature Extraction Appl. Math. Inf. c. 6 No. pp. 8-85 (0) Appled Mathematcs & Informaton cences An Internatonal Journal @ 0 NP Natural cences Publshng Cor. wo-dmensonal upervsed Dscrmnant Proecton Method For Feature Extracton

More information

Face Recognition University at Buffalo CSE666 Lecture Slides Resources:

Face Recognition University at Buffalo CSE666 Lecture Slides Resources: Face Recognton Unversty at Buffalo CSE666 Lecture Sldes Resources: http://www.face-rec.org/algorthms/ Overvew of face recognton algorthms Correlaton - Pxel based correspondence between two face mages Structural

More information

12/2/2009. Announcements. Parametric / Non-parametric. Case-Based Reasoning. Nearest-Neighbor on Images. Nearest-Neighbor Classification

12/2/2009. Announcements. Parametric / Non-parametric. Case-Based Reasoning. Nearest-Neighbor on Images. Nearest-Neighbor Classification Introducton to Artfcal Intellgence V22.0472-001 Fall 2009 Lecture 24: Nearest-Neghbors & Support Vector Machnes Rob Fergus Dept of Computer Scence, Courant Insttute, NYU Sldes from Danel Yeung, John DeNero

More information

A Fast Content-Based Multimedia Retrieval Technique Using Compressed Data

A Fast Content-Based Multimedia Retrieval Technique Using Compressed Data A Fast Content-Based Multmeda Retreval Technque Usng Compressed Data Borko Furht and Pornvt Saksobhavvat NSF Multmeda Laboratory Florda Atlantc Unversty, Boca Raton, Florda 3343 ABSTRACT In ths paper,

More information

Recognizing Faces. Outline

Recognizing Faces. Outline Recognzng Faces Drk Colbry Outlne Introducton and Motvaton Defnng a feature vector Prncpal Component Analyss Lnear Dscrmnate Analyss !"" #$""% http://www.nfotech.oulu.f/annual/2004 + &'()*) '+)* 2 ! &

More information

Classification / Regression Support Vector Machines

Classification / Regression Support Vector Machines Classfcaton / Regresson Support Vector Machnes Jeff Howbert Introducton to Machne Learnng Wnter 04 Topcs SVM classfers for lnearly separable classes SVM classfers for non-lnearly separable classes SVM

More information

Collaboratively Regularized Nearest Points for Set Based Recognition

Collaboratively Regularized Nearest Points for Set Based Recognition Academc Center for Computng and Meda Studes, Kyoto Unversty Collaboratvely Regularzed Nearest Ponts for Set Based Recognton Yang Wu, Mchhko Mnoh, Masayuk Mukunok Kyoto Unversty 9/1/013 BMVC 013 @ Brstol,

More information

Classification of Face Images Based on Gender using Dimensionality Reduction Techniques and SVM

Classification of Face Images Based on Gender using Dimensionality Reduction Techniques and SVM Classfcaton of Face Images Based on Gender usng Dmensonalty Reducton Technques and SVM Fahm Mannan 260 266 294 School of Computer Scence McGll Unversty Abstract Ths report presents gender classfcaton based

More information

User Authentication Based On Behavioral Mouse Dynamics Biometrics

User Authentication Based On Behavioral Mouse Dynamics Biometrics User Authentcaton Based On Behavoral Mouse Dynamcs Bometrcs Chee-Hyung Yoon Danel Donghyun Km Department of Computer Scence Department of Computer Scence Stanford Unversty Stanford Unversty Stanford, CA

More information

Lecture 5: Multilayer Perceptrons

Lecture 5: Multilayer Perceptrons Lecture 5: Multlayer Perceptrons Roger Grosse 1 Introducton So far, we ve only talked about lnear models: lnear regresson and lnear bnary classfers. We noted that there are functons that can t be represented

More information

A Binarization Algorithm specialized on Document Images and Photos

A Binarization Algorithm specialized on Document Images and Photos A Bnarzaton Algorthm specalzed on Document mages and Photos Ergna Kavalleratou Dept. of nformaton and Communcaton Systems Engneerng Unversty of the Aegean kavalleratou@aegean.gr Abstract n ths paper, a

More information

A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION

A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION 1 THE PUBLISHING HOUSE PROCEEDINGS OF THE ROMANIAN ACADEMY, Seres A, OF THE ROMANIAN ACADEMY Volume 4, Number 2/2003, pp.000-000 A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION Tudor BARBU Insttute

More information

S1 Note. Basis functions.

S1 Note. Basis functions. S1 Note. Bass functons. Contents Types of bass functons...1 The Fourer bass...2 B-splne bass...3 Power and type I error rates wth dfferent numbers of bass functons...4 Table S1. Smulaton results of type

More information

Classifying Acoustic Transient Signals Using Artificial Intelligence

Classifying Acoustic Transient Signals Using Artificial Intelligence Classfyng Acoustc Transent Sgnals Usng Artfcal Intellgence Steve Sutton, Unversty of North Carolna At Wlmngton (suttons@charter.net) Greg Huff, Unversty of North Carolna At Wlmngton (jgh7476@uncwl.edu)

More information

Smoothing Spline ANOVA for variable screening

Smoothing Spline ANOVA for variable screening Smoothng Splne ANOVA for varable screenng a useful tool for metamodels tranng and mult-objectve optmzaton L. Rcco, E. Rgon, A. Turco Outlne RSM Introducton Possble couplng Test case MOO MOO wth Game Theory

More information

Machine Learning. Topic 6: Clustering

Machine Learning. Topic 6: Clustering Machne Learnng Topc 6: lusterng lusterng Groupng data nto (hopefully useful) sets. Thngs on the left Thngs on the rght Applcatons of lusterng Hypothess Generaton lusters mght suggest natural groups. Hypothess

More information

An Entropy-Based Approach to Integrated Information Needs Assessment

An Entropy-Based Approach to Integrated Information Needs Assessment Dstrbuton Statement A: Approved for publc release; dstrbuton s unlmted. An Entropy-Based Approach to ntegrated nformaton Needs Assessment June 8, 2004 Wllam J. Farrell Lockheed Martn Advanced Technology

More information

Hierarchical clustering for gene expression data analysis

Hierarchical clustering for gene expression data analysis Herarchcal clusterng for gene expresson data analyss Gorgo Valentn e-mal: valentn@ds.unm.t Clusterng of Mcroarray Data. Clusterng of gene expresson profles (rows) => dscovery of co-regulated and functonally

More information

Edge Detection in Noisy Images Using the Support Vector Machines

Edge Detection in Noisy Images Using the Support Vector Machines Edge Detecton n Nosy Images Usng the Support Vector Machnes Hlaro Gómez-Moreno, Saturnno Maldonado-Bascón, Francsco López-Ferreras Sgnal Theory and Communcatons Department. Unversty of Alcalá Crta. Madrd-Barcelona

More information

Announcements. Supervised Learning

Announcements. Supervised Learning Announcements See Chapter 5 of Duda, Hart, and Stork. Tutoral by Burge lnked to on web page. Supervsed Learnng Classfcaton wth labeled eamples. Images vectors n hgh-d space. Supervsed Learnng Labeled eamples

More information

BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET

BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET 1 BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET TZU-CHENG CHUANG School of Electrcal and Computer Engneerng, Purdue Unversty, West Lafayette, Indana 47907 SAUL B. GELFAND School

More information

Discriminative classifiers for object classification. Last time

Discriminative classifiers for object classification. Last time Dscrmnatve classfers for object classfcaton Thursday, Nov 12 Krsten Grauman UT Austn Last tme Supervsed classfcaton Loss and rsk, kbayes rule Skn color detecton example Sldng ndo detecton Classfers, boostng

More information

Improvement of Spatial Resolution Using BlockMatching Based Motion Estimation and Frame. Integration

Improvement of Spatial Resolution Using BlockMatching Based Motion Estimation and Frame. Integration Improvement of Spatal Resoluton Usng BlockMatchng Based Moton Estmaton and Frame Integraton Danya Suga and Takayuk Hamamoto Graduate School of Engneerng, Tokyo Unversty of Scence, 6-3-1, Nuku, Katsuska-ku,

More information

Fuzzy Modeling of the Complexity vs. Accuracy Trade-off in a Sequential Two-Stage Multi-Classifier System

Fuzzy Modeling of the Complexity vs. Accuracy Trade-off in a Sequential Two-Stage Multi-Classifier System Fuzzy Modelng of the Complexty vs. Accuracy Trade-off n a Sequental Two-Stage Mult-Classfer System MARK LAST 1 Department of Informaton Systems Engneerng Ben-Guron Unversty of the Negev Beer-Sheva 84105

More information

Machine Learning 9. week

Machine Learning 9. week Machne Learnng 9. week Mappng Concept Radal Bass Functons (RBF) RBF Networks 1 Mappng It s probably the best scenaro for the classfcaton of two dataset s to separate them lnearly. As you see n the below

More information

An Evolvable Clustering Based Algorithm to Learn Distance Function for Supervised Environment

An Evolvable Clustering Based Algorithm to Learn Distance Function for Supervised Environment IJCSI Internatonal Journal of Computer Scence Issues, Vol. 7, Issue 5, September 2010 ISSN (Onlne): 1694-0814 www.ijcsi.org 374 An Evolvable Clusterng Based Algorthm to Learn Dstance Functon for Supervsed

More information

Incremental Learning with Support Vector Machines and Fuzzy Set Theory

Incremental Learning with Support Vector Machines and Fuzzy Set Theory The 25th Workshop on Combnatoral Mathematcs and Computaton Theory Incremental Learnng wth Support Vector Machnes and Fuzzy Set Theory Yu-Mng Chuang 1 and Cha-Hwa Ln 2* 1 Department of Computer Scence and

More information

Correlative features for the classification of textural images

Correlative features for the classification of textural images Correlatve features for the classfcaton of textural mages M A Turkova 1 and A V Gadel 1, 1 Samara Natonal Research Unversty, Moskovskoe Shosse 34, Samara, Russa, 443086 Image Processng Systems Insttute

More information

R s s f. m y s. SPH3UW Unit 7.3 Spherical Concave Mirrors Page 1 of 12. Notes

R s s f. m y s. SPH3UW Unit 7.3 Spherical Concave Mirrors Page 1 of 12. Notes SPH3UW Unt 7.3 Sphercal Concave Mrrors Page 1 of 1 Notes Physcs Tool box Concave Mrror If the reflectng surface takes place on the nner surface of the sphercal shape so that the centre of the mrror bulges

More information

RECOGNIZING GENDER THROUGH FACIAL IMAGE USING SUPPORT VECTOR MACHINE

RECOGNIZING GENDER THROUGH FACIAL IMAGE USING SUPPORT VECTOR MACHINE Journal of Theoretcal and Appled Informaton Technology 30 th June 06. Vol.88. No.3 005-06 JATIT & LLS. All rghts reserved. ISSN: 99-8645 www.jatt.org E-ISSN: 87-395 RECOGNIZING GENDER THROUGH FACIAL IMAGE

More information

Intelligent Information Acquisition for Improved Clustering

Intelligent Information Acquisition for Improved Clustering Intellgent Informaton Acquston for Improved Clusterng Duy Vu Unversty of Texas at Austn duyvu@cs.utexas.edu Mkhal Blenko Mcrosoft Research mblenko@mcrosoft.com Prem Melvlle IBM T.J. Watson Research Center

More information

Learning-Based Top-N Selection Query Evaluation over Relational Databases

Learning-Based Top-N Selection Query Evaluation over Relational Databases Learnng-Based Top-N Selecton Query Evaluaton over Relatonal Databases Lang Zhu *, Wey Meng ** * School of Mathematcs and Computer Scence, Hebe Unversty, Baodng, Hebe 071002, Chna, zhu@mal.hbu.edu.cn **

More information

Learning a Class-Specific Dictionary for Facial Expression Recognition

Learning a Class-Specific Dictionary for Facial Expression Recognition BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 16, No 4 Sofa 016 Prnt ISSN: 1311-970; Onlne ISSN: 1314-4081 DOI: 10.1515/cat-016-0067 Learnng a Class-Specfc Dctonary for

More information

Face Recognition Method Based on Within-class Clustering SVM

Face Recognition Method Based on Within-class Clustering SVM Face Recognton Method Based on Wthn-class Clusterng SVM Yan Wu, Xao Yao and Yng Xa Department of Computer Scence and Engneerng Tong Unversty Shangha, Chna Abstract - A face recognton method based on Wthn-class

More information

Solving two-person zero-sum game by Matlab

Solving two-person zero-sum game by Matlab Appled Mechancs and Materals Onlne: 2011-02-02 ISSN: 1662-7482, Vols. 50-51, pp 262-265 do:10.4028/www.scentfc.net/amm.50-51.262 2011 Trans Tech Publcatons, Swtzerland Solvng two-person zero-sum game by

More information

Outline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like:

Outline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like: Self-Organzng Maps (SOM) Turgay İBRİKÇİ, PhD. Outlne Introducton Structures of SOM SOM Archtecture Neghborhoods SOM Algorthm Examples Summary 1 2 Unsupervsed Hebban Learnng US Hebban Learnng, Cntd 3 A

More information

Modular PCA Face Recognition Based on Weighted Average

Modular PCA Face Recognition Based on Weighted Average odern Appled Scence odular PCA Face Recognton Based on Weghted Average Chengmao Han (Correspondng author) Department of athematcs, Lny Normal Unversty Lny 76005, Chna E-mal: hanchengmao@163.com Abstract

More information

Appearance-based Statistical Methods for Face Recognition

Appearance-based Statistical Methods for Face Recognition 47th Internatonal Symposum ELMAR-2005, 08-10 June 2005, Zadar, Croata Appearance-based Statstcal Methods for Face Recognton Kresmr Delac 1, Mslav Grgc 2, Panos Latss 3 1 Croatan elecom, Savsa 32, Zagreb,

More information

Backpropagation: In Search of Performance Parameters

Backpropagation: In Search of Performance Parameters Bacpropagaton: In Search of Performance Parameters ANIL KUMAR ENUMULAPALLY, LINGGUO BU, and KHOSROW KAIKHAH, Ph.D. Computer Scence Department Texas State Unversty-San Marcos San Marcos, TX-78666 USA ae049@txstate.edu,

More information

Data Mining: Model Evaluation

Data Mining: Model Evaluation Data Mnng: Model Evaluaton Aprl 16, 2013 1 Issues: Evaluatng Classfcaton Methods Accurac classfer accurac: predctng class label predctor accurac: guessng value of predcted attrbutes Speed tme to construct

More information

An Optimal Algorithm for Prufer Codes *

An Optimal Algorithm for Prufer Codes * J. Software Engneerng & Applcatons, 2009, 2: 111-115 do:10.4236/jsea.2009.22016 Publshed Onlne July 2009 (www.scrp.org/journal/jsea) An Optmal Algorthm for Prufer Codes * Xaodong Wang 1, 2, Le Wang 3,

More information

PRÉSENTATIONS DE PROJETS

PRÉSENTATIONS DE PROJETS PRÉSENTATIONS DE PROJETS Rex Onlne (V. Atanasu) What s Rex? Rex s an onlne browser for collectons of wrtten documents [1]. Asde ths core functon t has however many other applcatons that make t nterestng

More information

Fitting & Matching. Lecture 4 Prof. Bregler. Slides from: S. Lazebnik, S. Seitz, M. Pollefeys, A. Effros.

Fitting & Matching. Lecture 4 Prof. Bregler. Slides from: S. Lazebnik, S. Seitz, M. Pollefeys, A. Effros. Fttng & Matchng Lecture 4 Prof. Bregler Sldes from: S. Lazebnk, S. Setz, M. Pollefeys, A. Effros. How do we buld panorama? We need to match (algn) mages Matchng wth Features Detect feature ponts n both

More information

Understanding K-Means Non-hierarchical Clustering

Understanding K-Means Non-hierarchical Clustering SUNY Albany - Techncal Report 0- Understandng K-Means Non-herarchcal Clusterng Ian Davdson State Unversty of New York, 1400 Washngton Ave., Albany, 105. DAVIDSON@CS.ALBANY.EDU Abstract The K-means algorthm

More information

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr)

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr) Helsnk Unversty Of Technology, Systems Analyss Laboratory Mat-2.08 Independent research projects n appled mathematcs (3 cr) "! #$&% Antt Laukkanen 506 R ajlaukka@cc.hut.f 2 Introducton...3 2 Multattrbute

More information

Parallel matrix-vector multiplication

Parallel matrix-vector multiplication Appendx A Parallel matrx-vector multplcaton The reduced transton matrx of the three-dmensonal cage model for gel electrophoress, descrbed n secton 3.2, becomes excessvely large for polymer lengths more

More information

Image Representation & Visualization Basic Imaging Algorithms Shape Representation and Analysis. outline

Image Representation & Visualization Basic Imaging Algorithms Shape Representation and Analysis. outline mage Vsualzaton mage Vsualzaton mage Representaton & Vsualzaton Basc magng Algorthms Shape Representaton and Analyss outlne mage Representaton & Vsualzaton Basc magng Algorthms Shape Representaton and

More information

TN348: Openlab Module - Colocalization

TN348: Openlab Module - Colocalization TN348: Openlab Module - Colocalzaton Topc The Colocalzaton module provdes the faclty to vsualze and quantfy colocalzaton between pars of mages. The Colocalzaton wndow contans a prevew of the two mages

More information

A Genetic Programming-PCA Hybrid Face Recognition Algorithm

A Genetic Programming-PCA Hybrid Face Recognition Algorithm Journal of Sgnal and Informaton Processng, 20, 2, 70-74 do:0.4236/jsp.20.23022 Publshed Onlne August 20 (http://www.scrp.org/journal/jsp) A Genetc Programmng-PCA Hybrd Face Recognton Algorthm Behzad Bozorgtabar,

More information

Unsupervised Learning and Clustering

Unsupervised Learning and Clustering Unsupervsed Learnng and Clusterng Supervsed vs. Unsupervsed Learnng Up to now we consdered supervsed learnng scenaro, where we are gven 1. samples 1,, n 2. class labels for all samples 1,, n Ths s also

More information

Graph-based Clustering

Graph-based Clustering Graphbased Clusterng Transform the data nto a graph representaton ertces are the data ponts to be clustered Edges are eghted based on smlarty beteen data ponts Graph parttonng Þ Each connected component

More information

Course Introduction. Algorithm 8/31/2017. COSC 320 Advanced Data Structures and Algorithms. COSC 320 Advanced Data Structures and Algorithms

Course Introduction. Algorithm 8/31/2017. COSC 320 Advanced Data Structures and Algorithms. COSC 320 Advanced Data Structures and Algorithms Course Introducton Course Topcs Exams, abs, Proects A quc loo at a few algorthms 1 Advanced Data Structures and Algorthms Descrpton: We are gong to dscuss algorthm complexty analyss, algorthm desgn technques

More information

CS246: Mining Massive Datasets Jure Leskovec, Stanford University

CS246: Mining Massive Datasets Jure Leskovec, Stanford University CS46: Mnng Massve Datasets Jure Leskovec, Stanford Unversty http://cs46.stanford.edu /19/013 Jure Leskovec, Stanford CS46: Mnng Massve Datasets, http://cs46.stanford.edu Perceptron: y = sgn( x Ho to fnd

More information

Associative Based Classification Algorithm For Diabetes Disease Prediction

Associative Based Classification Algorithm For Diabetes Disease Prediction Internatonal Journal of Engneerng Trends and Technology (IJETT) Volume-41 Number-3 - November 016 Assocatve Based Classfcaton Algorthm For Dabetes Dsease Predcton 1 N. Gnana Deepka, Y.surekha, 3 G.Laltha

More information

Face Recognition using 3D Directional Corner Points

Face Recognition using 3D Directional Corner Points 2014 22nd Internatonal Conference on Pattern Recognton Face Recognton usng 3D Drectonal Corner Ponts Xun Yu, Yongsheng Gao School of Engneerng Grffth Unversty Nathan, QLD, Australa xun.yu@grffthun.edu.au,

More information

Local Quaternary Patterns and Feature Local Quaternary Patterns

Local Quaternary Patterns and Feature Local Quaternary Patterns Local Quaternary Patterns and Feature Local Quaternary Patterns Jayu Gu and Chengjun Lu The Department of Computer Scence, New Jersey Insttute of Technology, Newark, NJ 0102, USA Abstract - Ths paper presents

More information

KOHONEN'S SELF ORGANIZING NETWORKS WITH "CONSCIENCE"

KOHONEN'S SELF ORGANIZING NETWORKS WITH CONSCIENCE Kohonen's Self Organzng Maps and ther use n Interpretaton, Dr. M. Turhan (Tury) Taner, Rock Sold Images Page: 1 KOHONEN'S SELF ORGANIZING NETWORKS WITH "CONSCIENCE" By: Dr. M. Turhan (Tury) Taner, Rock

More information

Query Clustering Using a Hybrid Query Similarity Measure

Query Clustering Using a Hybrid Query Similarity Measure Query clusterng usng a hybrd query smlarty measure Fu. L., Goh, D.H., & Foo, S. (2004). WSEAS Transacton on Computers, 3(3), 700-705. Query Clusterng Usng a Hybrd Query Smlarty Measure Ln Fu, Don Hoe-Lan

More information

Laplacian Eigenmap for Image Retrieval

Laplacian Eigenmap for Image Retrieval Laplacan Egenmap for Image Retreval Xaofe He Partha Nyog Department of Computer Scence The Unversty of Chcago, 1100 E 58 th Street, Chcago, IL 60637 ABSTRACT Dmensonalty reducton has been receved much

More information

CS 534: Computer Vision Model Fitting

CS 534: Computer Vision Model Fitting CS 534: Computer Vson Model Fttng Sprng 004 Ahmed Elgammal Dept of Computer Scence CS 534 Model Fttng - 1 Outlnes Model fttng s mportant Least-squares fttng Maxmum lkelhood estmaton MAP estmaton Robust

More information

An Anti-Noise Text Categorization Method based on Support Vector Machines *

An Anti-Noise Text Categorization Method based on Support Vector Machines * An Ant-Nose Text ategorzaton Method based on Support Vector Machnes * hen Ln, Huang Je and Gong Zheng-Hu School of omputer Scence, Natonal Unversty of Defense Technology, hangsha, 410073, hna chenln@nudt.edu.cn,

More information

Facial Expression Recognition Based on Local Binary Patterns and Local Fisher Discriminant Analysis

Facial Expression Recognition Based on Local Binary Patterns and Local Fisher Discriminant Analysis WSEAS RANSACIONS on SIGNAL PROCESSING Shqng Zhang, Xaomng Zhao, Bcheng Le Facal Expresson Recognton Based on Local Bnary Patterns and Local Fsher Dscrmnant Analyss SHIQING ZHANG, XIAOMING ZHAO, BICHENG

More information

Determining the Optimal Bandwidth Based on Multi-criterion Fusion

Determining the Optimal Bandwidth Based on Multi-criterion Fusion Proceedngs of 01 4th Internatonal Conference on Machne Learnng and Computng IPCSIT vol. 5 (01) (01) IACSIT Press, Sngapore Determnng the Optmal Bandwdth Based on Mult-crteron Fuson Ha-L Lang 1+, Xan-Mn

More information

An Image Fusion Approach Based on Segmentation Region

An Image Fusion Approach Based on Segmentation Region Rong Wang, L-Qun Gao, Shu Yang, Yu-Hua Cha, and Yan-Chun Lu An Image Fuson Approach Based On Segmentaton Regon An Image Fuson Approach Based on Segmentaton Regon Rong Wang, L-Qun Gao, Shu Yang 3, Yu-Hua

More information

An Improved Spectral Clustering Algorithm Based on Local Neighbors in Kernel Space 1

An Improved Spectral Clustering Algorithm Based on Local Neighbors in Kernel Space 1 DOI: 10.98/CSIS110415064L An Improved Spectral Clusterng Algorthm Based on Local Neghbors n Kernel Space 1 Xnyue Lu 1,, Xng Yong and Hongfe Ln 1 1 School of Computer Scence and Technology, Dalan Unversty

More information

Module Management Tool in Software Development Organizations

Module Management Tool in Software Development Organizations Journal of Computer Scence (5): 8-, 7 ISSN 59-66 7 Scence Publcatons Management Tool n Software Development Organzatons Ahmad A. Al-Rababah and Mohammad A. Al-Rababah Faculty of IT, Al-Ahlyyah Amman Unversty,

More information

A Clustering Algorithm for Key Frame Extraction Based on Density Peak

A Clustering Algorithm for Key Frame Extraction Based on Density Peak Journal of Computer and Communcatons, 2018, 6, 118-128 http://www.scrp.org/ournal/cc ISSN Onlne: 2327-5227 ISSN Prnt: 2327-5219 A Clusterng Algorthm for Key Frame Extracton Based on Densty Peak Hong Zhao

More information

Skew Angle Estimation and Correction of Hand Written, Textual and Large areas of Non-Textual Document Images: A Novel Approach

Skew Angle Estimation and Correction of Hand Written, Textual and Large areas of Non-Textual Document Images: A Novel Approach Angle Estmaton and Correcton of Hand Wrtten, Textual and Large areas of Non-Textual Document Images: A Novel Approach D.R.Ramesh Babu Pyush M Kumat Mahesh D Dhannawat PES Insttute of Technology Research

More information

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints Australan Journal of Basc and Appled Scences, 2(4): 1204-1208, 2008 ISSN 1991-8178 Sum of Lnear and Fractonal Multobjectve Programmng Problem under Fuzzy Rules Constrants 1 2 Sanjay Jan and Kalash Lachhwan

More information

General Vector Machine. Hong Zhao Department of Physics, Xiamen University

General Vector Machine. Hong Zhao Department of Physics, Xiamen University General Vector Machne Hong Zhao (zhaoh@xmu.edu.cn) Department of Physcs, Xamen Unversty The support vector machne (SVM) s an mportant class of learnng machnes for functon approach, pattern recognton, and

More information

A Multi-step Strategy for Shape Similarity Search In Kamon Image Database

A Multi-step Strategy for Shape Similarity Search In Kamon Image Database A Mult-step Strategy for Shape Smlarty Search In Kamon Image Database Paul W.H. Kwan, Kazuo Torach 2, Kesuke Kameyama 2, Junbn Gao 3, Nobuyuk Otsu 4 School of Mathematcs, Statstcs and Computer Scence,

More information

Programming in Fortran 90 : 2017/2018

Programming in Fortran 90 : 2017/2018 Programmng n Fortran 90 : 2017/2018 Programmng n Fortran 90 : 2017/2018 Exercse 1 : Evaluaton of functon dependng on nput Wrte a program who evaluate the functon f (x,y) for any two user specfed values

More information

A Robust Method for Estimating the Fundamental Matrix

A Robust Method for Estimating the Fundamental Matrix Proc. VIIth Dgtal Image Computng: Technques and Applcatons, Sun C., Talbot H., Ourseln S. and Adraansen T. (Eds.), 0- Dec. 003, Sydney A Robust Method for Estmatng the Fundamental Matrx C.L. Feng and Y.S.

More information