Multilabel Classification with Meta-level Features


Siddharth Gopal, Carnegie Mellon University, Pittsburgh PA 15213
Yiming Yang, Carnegie Mellon University, Pittsburgh PA 15213

ABSTRACT

Effective learning in multi-label classification (MLC) requires an appropriate level of abstraction for representing the relationship between each instance and multiple categories. Current MLC methods have been focused on learning to map from instances to ranked lists of categories in a relatively high-dimensional space. The fine-grained features in such a space may not be sufficiently expressive for characterizing discriminative patterns, and worse, make the model complexity unnecessarily high. This paper proposes an alternative approach by transforming conventional representations of instances and categories into a relatively small set of link-based meta-level features, and leveraging successful learning-to-rank retrieval algorithms (e.g., SVM-MAP) over this reduced feature space. Controlled experiments on multiple benchmark datasets show strong empirical evidence for the strength of the proposed approach, as it significantly outperformed several state-of-the-art methods, including Rank-SVM, ML-kNN and IBLR-ML (Instance-based Logistic Regression for Multi-label Classification) in most cases.

Categories and Subject Descriptors: I.5.2 [PATTERN RECOGNITION]: Design Methodology; Classifier design and evaluation; H.1.0 [INFORMATION SYSTEMS]: General.

General Terms: Algorithms, Design, Experimentation, Performance

Keywords: Multi-label classification; model design; learning to rank; comparative evaluation.

1. INTRODUCTION

Multi-label classification (MLC) refers to the problem of instance labeling where each instance may have more than one correct label. MLC has a broad range of applications. For example, a news article could belong to both topics politics and economics, and also could be related to China and USA as the regional categories.
An image (picture) could have flower as the object type, yellow and red as the colors, and outdoor as the background category. A computer trouble report could be simultaneously related to a hardware failure, a software problem, an urgency-level category, a regional code, and so on.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. SIGIR'10, July 19-23, 2010, Geneva, Switzerland. Copyright 2010 ACM /10/07 $10.00.

MLC is technically challenging as it goes beyond the scope of well-studied two-way classifiers, such as binary Support Vector Machines (SVM), Naïve Bayes probabilistic classifiers, etc. Approaches to MLC typically reduce the problem into two subproblems: the first is learning to rank categories with respect to each input instance, and the second is learning to place a threshold on each ranked list for a yes/no decision per category. The first subproblem is the most challenging part and therefore has been the central focus in MLC. A variety of approaches has been developed and can be roughly divided into two types: binary-classifier based methods versus global optimization methods, and the latter can be further divided into model-based and instance-based methods.

Binary-classifier based methods are the simplest. A representative example is to use a standard SVM ("binary-SVM") [17] to learn a scoring function for each category independently from the scoring functions for other categories. Other kinds of binary classifiers could also be used for such a purpose, such as logistic regression, Naïve Bayes probabilistic classifiers, boosting algorithms, neural networks, etc. In the testing phase, the ranked list of categories is obtained for each test instance by scoring each category independently and then sorting the scores.
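As a sketch of this baseline scheme (our own illustrative code, not the paper's experimental setup), one independent linear scorer per category suffices; here a ridge-regression scorer on +/-1 targets stands in for any per-category binary classifier:

```python
import numpy as np

def train_per_category(X, Y, lam=1.0):
    # One independent linear scorer per category; ridge regression on
    # +/-1 targets stands in here for any binary classifier (binary SVM,
    # logistic regression, ...).  X: (n, d) instances, Y: (n, m) labels.
    d = X.shape[1]
    W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ (2 * Y - 1))
    return W  # (d, m): one weight vector (column) per category

def rank_categories(W, x):
    # Score every category independently, then sort the scores to get
    # the ranked list of categories for test instance x.
    scores = x @ W
    return np.argsort(-scores), scores
```

Each category is fit and scored in isolation, which is exactly the property the global-optimization methods below are designed to overcome.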
Binary-classifier based methods have been commonly used due to their simplicity, but also have been criticized for the lack of global optimization in category scoring. These methods are common baselines in benchmark evaluations of stronger methods in MLC.

Elisseeff and Weston [5] proposed a large-margin approach, Rank-SVM, which is a representative example of model-based methods. Unlike conventional binary SVM which maximizes the margin for each category independently, Rank-SVM maximizes the sum of the margins for all categories simultaneously, conditioned on partial-order constraints. That is, ranking the relevant categories of each training instance higher than the irrelevant categories is an explicit optimization criterion in this method. The scoring function is parameterized by the weights of input features for every category; thus, the number of parameters in Rank-SVM is the product of the number of categories and the size of the feature set. In other words, the model complexity (measured using the number of free parameters) is the same as that of m binary-SVMs where m is the number of target categories. Experiments by the authors of Rank-SVM show that this method significantly outperformed binary SVM in gene classification on a micro-array dataset (namely the "yeast" dataset).

Zhang and Zhou [25] proposed an instance-based approach named Multi-label k-Nearest Neighbor (ML-kNN). Cheng and Hüllermeier proposed another variant called Instance-Based Logistic Regression (IBLR) [3]. Multi-label versions of kNN have been studied in text categorization for a long time and are commonly used as strong baselines in benchmark evaluations [2][20][22]. ML-kNN and IBLR are relatively new variants which are similar in the sense that both use Euclidean distance to identify the k nearest neighbors for each test instance, but differ in how the local probabilities are estimated for categories. ML-kNN gives an equal weight to each label occurrence in the neighborhood of the input instance while IBLR varies the weight of each label according to how distant it is from the test instance. Further, ML-kNN assumes independence among category occurrences while IBLR explicitly models pairwise dependencies among categories using logistic regression. IBLR also makes combined use of instance-based features (such as the similarity score of each neighbor) and conventional features (such as words in the test document) in the logistic regression. The evaluations by the authors of ML-kNN showed its superior performance over Rank-SVM, BoosTexter [14] and AdaBoost.MH [15], and the experiments by the authors of IBLR showed further performance improvement by IBLR over the results of ML-kNN on multiple datasets [3].

MLC methods, including the ones discussed above, have been focused on learning-to-rank in a relatively high-dimensional space. In Rank-SVM for text categorization the features are words in the training-set document vocabulary. In IBLR for text categorization the feature set is the union of word-level features and kNN-based features (used to model the interdependencies among categories). Both methods learn a linear function per category, so the total number of parameters is md where d is the feature-set size and m is the number of categories. In machine learning, it is generally understood that when the number of model parameters is unnecessarily large, the model tends to overfit the training data and does not generalize well on test data. To what extent would this be an issue in current MLC methods?
Further, can we find a better solution for MLC by transforming lower-level features to higher-level ones, and then learning to rank more effectively in the reduced space? Thorough investigation is needed to answer these questions, and such is the primary focus of this paper. Specifically, we address these questions by the following means:

1) We propose a generic framework that allows automated transformation of a conventional data representation (such as a bag of words per instance and a set of training instances per category) into meta-level features. This enables a broad range of learning-to-rank algorithms in information retrieval (IR) to be deployed for ranking categories in MLC.

2) We use SVM-MAP, a large-margin method for optimizing ranked lists of documents with respect to the Mean Average Precision in IR, as the choice of algorithm in this paper to illustrate effective MLC learning with meta-level features. For convenience we call this instantiation of our framework SVM-MAP-MLC, in distinction from the use of SVM-MAP in ad-hoc information retrieval.

3) We conduct controlled experiments on multiple benchmark datasets to evaluate SVM-MAP-MLC in comparison with other state-of-the-art methods, including Rank-SVM, ML-kNN, IBLR and Binary-SVM.

4) We provide strong empirical evidence for the strengths of the proposed method, with p-values at the 1% level or smaller on all the datasets in statistical significance tests comparing our approach with the other state-of-the-art methods.

The rest of the paper is organized as follows. Section 2 introduces the meta-level features for MLC. Section 3 describes the method for category ranking optimization, i.e., SVM-MAP-MLC, and the strategy for optimizing the threshold on each ranked list. Section 4 presents the design of the controlled experiments. Section 5 reports the main results. Section 6 summarizes our findings and conclusions.
2. META-LEVEL FEATURES FOR MLC

Following a standard notation in machine learning, we define X as the input space (of all possible instances), Y as the output space (of all possible ranked lists of target variables), and T = {(x_1, y_1), (x_2, y_2), ..., (x_n, y_n)} as a training set of n pairs of x in X and y in Y. The learning-to-rank problem is generally defined as finding the optimal mapping f : X -> Y given T.

Now we define a transformation from x in X to z in Z with the following properties:

1) The feature-set size of Z should be relatively small.
2) The features in z should be highly informative about how each instance is related to multiple categories, and should discriminate relevant instances from irrelevant instances with respect to any possible subset of categories.
3) The transformation from x to z should be learnable based on a labeled training set and automatically computable for each test instance on the fly.
4) The transformed training data should allow a broad range of learning-to-rank algorithms to be used for MLC optimization with respect to various loss functions.

Based on these criteria we define vector z as the concatenation of the sub-vectors below, which are defined for each test instance x_0 and category c_i for i = 1, 2, ..., m as:

v_L2(x_0, c_i) = (d(x_1, x_0), d(x_2, x_0), ..., d(x_k, x_0)), a k-dimensional vector where x_j = kNN_j(x_0, c_i) is the j-th nearest neighbor (j = 1, 2, ..., k) of x_0 among the members of c_i in the training set, and d(x_j, x_0) = ||x_j - x_0||_2 is the L2-norm of the difference between the two vectors;

v_L1(x_0, c_i) = (d'(x_1, x_0), d'(x_2, x_0), ..., d'(x_k, x_0)), a k-dimensional vector where d'(x_j, x_0) = ||x_j - x_0||_1 is the L1-norm of the difference between the two vectors;

Footnote 1: For rare categories, since the number of training instances in each category is small, there might not be k nearest neighbors. In such a case we duplicate the furthest neighbor so that the sub-vector reaches size k.

v_cos(x_0, c_i) = (cos(x_1, x_0), cos(x_2, x_0), ..., cos(x_k, x_0)), a k-dimensional vector whose elements are the cosine similarities of the corresponding vector pairs;

v_mod(x_0, c_i) = (d(x̄, x_0), cos(x̄, x_0)), a 2-dimensional vector where x̄ is the centroid (vector average) of all the positive training examples in category c_i.

Of course the listed features are not necessarily exhaustive of all possibly informative features, but rather a set of concrete examples for illustrating the principle of our approach. The number of features in vector z is 3k + 2 per category, and a total of (3k + 2)m for m categories. Parameter k can be tuned on a held-out validation dataset and is typically in the range from 10 to 100. The size of such a meta-level feature set is much smaller than those typically found in current MLC methods. More importantly, these meta-level features are induced in a supervised manner, taking the labels of training instances into account. Also, the meta-level features make combined use of local information (through kNN-based features) and global information (through category centroids) in the training set. Vector z in Z synthetically represents a pattern for each instance about how it is related to multiple categories. Figure 1 illustrates the concept geometrically in a 2-D space. For simplicity we only plot the L2-norm links in this graph and omit the other types of links.

Figure 1: The link-based representation of one particular instance (the dot in the center) in relation to multiple categories. Each category is represented using its positive examples (points in the same color) in the training set and its centroid (triangle). Each instance is represented using the average links (i.e., the distance to each category centroid, shown by thick lines) and multiple single links (i.e., the distance to each of the k nearest neighbors, shown by the thin lines) per category. These links together portray an informative picture about how the instance is related to multiple categories.

3. OUR APPROACH: SVM-MAP-MLC
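Before detailing the ranking step, the Section 2 transformation can be made concrete with a short sketch (our own illustrative code, not the authors' implementation; it assumes every category has at least one positive training example):

```python
import numpy as np

def meta_features(x0, X_train, Y_train, k=10):
    # Concatenated (3k+2)-per-category meta-level vector for instance x0.
    # X_train: (n, d) instances; Y_train: (n, m) binary label matrix.
    # Assumes each category has at least one positive training example.
    eps = 1e-12
    parts = []
    for c in range(Y_train.shape[1]):
        members = X_train[Y_train[:, c] == 1]          # positives of category c
        dists = np.linalg.norm(members - x0, axis=1)   # L2 to every member
        nn = members[np.argsort(dists)[:k]]            # k nearest (or fewer)
        if len(nn) < k:                                # rare category: duplicate
            nn = np.vstack([nn, np.repeat(nn[-1:], k - len(nn), axis=0)])
        l2 = np.linalg.norm(nn - x0, axis=1)                      # v_L2
        l1 = np.abs(nn - x0).sum(axis=1)                          # v_L1
        cos = (nn @ x0) / (np.linalg.norm(nn, axis=1)
                           * np.linalg.norm(x0) + eps)            # v_cos
        cen = members.mean(axis=0)                                # centroid
        mod = np.array([np.linalg.norm(cen - x0),
                        (cen @ x0) / (np.linalg.norm(cen)
                                      * np.linalg.norm(x0) + eps)])  # v_mod
        parts += [l2, l1, cos, mod]
    return np.concatenate(parts)
```

Applying this to every training instance yields the transformed set Z on which any learning-to-rank method can operate.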
3.1 Learning to Rank for MLC

Our goal now is to solve the mapping f_MLC : Z -> Y based on the transformed training set T_MLC = {(z_1, y_1), (z_2, y_2), ..., (z_n, y_n)} where z_i = z(x_i) in Z is a transformed input vector whose elements are meta-level features, and y_i in Y is a true ranked list of categories for the input. Any learning-to-rank algorithm could be deployed in principle; among the successful ones, we choose SVM-MAP in the remainder of this paper.

SVM-MAP is originally designed for ranking documents with respect to ad-hoc queries, where "ad-hoc" means that queries could be any combination of words, not fixed in advance. The learning-to-rank problem is to find the optimal mapping f_IR : Q -> Y where Q is the input space (of all possible queries), Y is the output space (of all possible ranked lists of documents), and "optimal" means to minimize the training-set loss as well as the model complexity. A variety of learning-to-rank algorithms have been developed recently in machine learning for ad-hoc retrieval, with different loss functions and model selection criteria. SVM-MAP [24] is designed to maximize the Mean Average Precision (MAP) of ranked lists, which is a common metric for method comparison in IR evaluations. Methods focusing on other optimization criteria include a multivariate version of SVM [8] that maximizes ROC-Area for classification, McRank and AdaRank [12][19] that use boosting algorithms to maximize the Normalized Discounted Cumulative Gain (NDCG) of ranked lists, and so on.

Most learning-to-rank algorithms in ad-hoc retrieval rely on a shared representation between queries and documents, i.e., a bag-of-words per query and per document. Such a shared representation facilitates a natural way to induce features for discriminating the relevance of query-document pairs. For example, SVM-MAP uses a conventional search engine (Indri) to produce the cosine similarity and language-model based scores for each query-document pair, discretizes those scores into bins, and treats the bins as the features in a dimension-reduced vector space where each query-document pair with relevance judgment is treated as a training instance.
Optimizing the mapping from queries to ranked lists of documents therefore reduces to the learning of feature weights in the dimension-reduced vector space.

In order to apply SVM-MAP to MLC, or to apply any learning-to-rank retrieval algorithm to MLC in general, we need to find discriminative features to represent instance-category pairs and the one-to-many mapping from each instance to categories. The meta-level features we introduced in the previous section are designed for exactly such a purpose, allowing a broad range of learning-to-rank algorithms in IR to be deployed for MLC. The category-specific links (3k + 2 per category) in vector z (Section 2) and the corresponding label (yes or no with respect to the category) are treated as an instance-category pair in the training set. We focus on SVM-MAP in this paper because it explicitly optimizes MAP, which is one of the primary metrics we use in our evaluation of MLC methods (Section 4.3). SVM-MAP and other learning-to-rank retrieval methods have not been used for MLC before, to our knowledge. We name our novel application of SVM-MAP as SVM-MAP-MLC, in contrast to its conventional use in ad-hoc information retrieval.

Footnote 2: There may be more than one true ranked list for an instance. That is, any list that places all the relevant categories above all the irrelevant categories is truly correct.

3.2 Learning to Threshold for MLC

In order to enable the system to make classification decisions in MLC, we need to apply a threshold to the ranked list of categories for each test instance. A variety of thresholding strategies have been studied in the literature [21][5], among which we choose the linear regression approach proposed in [5]. Unlike binary-SVM, where the natural choice of threshold is zero, and probabilistic classifiers (such as logistic regression, Naïve Bayes, IBLR, etc.), where the default choice of threshold is 0.5, SVM-MAP-MLC produces non-probabilistic scores to rank categories with partial-order preferences. Obviously, neither 0 nor 0.5 is the appropriate optimal threshold on a ranked list of categories given an instance. The strategy proposed in Rank-SVM [5] is designed for optimizing the threshold conditioned on each ranked list. That is, the system uses a training set to learn a linear mapping from an arbitrary ranked list of categories to the optimal threshold as g : L -> T, where L is the space of all possible vectors of system-scored categories and T is the space of all possible thresholds. The optimal mapping is defined as the linear-least-squares-fit (LLSF) solution given a training set of ranked lists with the optimal threshold per list. The optimal threshold given a list is defined as the threshold that minimizes total error, i.e., the sum of type-I errors (the false positives) and type-II errors (the false negatives). Such a training set can be automatically generated by 1) learning an SVM-MAP-MLC model to score all categories conditioned on each input instance, and 2) finding the optimal threshold for each vector of scored categories given an instance.
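Steps 1) and 2) above, followed by the linear least-squares fit, can be sketched as follows (a minimal sketch with our own helper names; S holds the system scores of all categories for each training instance, and Y the true binary labels):

```python
import numpy as np

def best_threshold(scores, labels):
    # Threshold minimizing false positives + false negatives for one
    # scored list; candidates are midpoints between consecutive sorted
    # scores, plus one value above and one below all scores.
    s = np.sort(scores)[::-1]
    cands = np.concatenate([[s[0] + 1.0], (s[:-1] + s[1:]) / 2.0, [s[-1] - 1.0]])
    best, best_err = cands[0], np.inf
    for t in cands:
        pred = scores >= t
        err = np.sum(pred & (labels == 0)) + np.sum(~pred & (labels == 1))
        if err < best_err:
            best_err, best = err, t
    return best

def fit_llsf_threshold(S, Y):
    # Linear least-squares fit from score vectors (rows of S) to their
    # per-instance optimal thresholds, as in the Rank-SVM strategy [5].
    t = np.array([best_threshold(S[i], Y[i]) for i in range(len(S))])
    A = np.hstack([S, np.ones((len(S), 1))])   # affine map g(s) = w.s + b
    w, *_ = np.linalg.lstsq(A, t, rcond=None)
    return w

def predict_labels(S, w):
    # Categories at or above the predicted threshold receive "yes".
    thr = np.hstack([S, np.ones((len(S), 1))]) @ w
    return (S >= thr[:, None]).astype(int)
```

The same fitted map is then applied unchanged to the scores of each test instance, as described next.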
The LLSF function is applied in the testing phase to the system-scored categories conditioned on each test instance. The resulting threshold is then applied to the corresponding ranked list of categories: the categories above or at the threshold receive a "yes" decision, and the categories below the threshold receive a "no" decision. We modified the original method by rescaling the scores of each instance to unit norm.

4. EVALUATION DESIGN

4.1 Methods for Comparison

We conducted controlled experiments to evaluate SVM-MAP-MLC in comparison with the following methods:

Binary-SVM is a standard version of SVM for one-versus-rest classification, and a common baseline in comparative evaluation of classifiers (including MLC methods) [5][9].

Rank-SVM, the method proposed by [5], is representative among the model-based methods which explicitly optimize ranked lists of categories for MLC.

IBLR, the instance-based method recently proposed by [3], is a new method in MLC evaluations. The method has two versions, one using kNN-based features only (IBLR-ML) and another (IBLR-ML+) using word-level features in conjunction with kNN-based features. We tested both versions and found that IBLR-ML performed consistently better than IBLR-ML+, which agrees with the conclusion by the authors of IBLR [3]. We therefore use the results of IBLR-ML for method comparison in the rest of the paper. We used the Weka [7] implementation provided by the authors.

ML-kNN, the instance-based method proposed by [25], is another strong baseline in MLC evaluations [3]. We used the publicly available Weka implementation [7] of this method in our experiments.

All the systems produce a ranked list of categories given a test instance. The ranked lists can be directly evaluated using rank-based metrics, and indirectly evaluated based on binary classification decisions (yes or no on each category) after applying a threshold to each ranked list. The choice of thresholding strategy depends on the nature of each classification method. In Binary-SVM, for each category, we used the conventional threshold of zero to obtain a yes/no decision for that category.
In ML-kNN, IBLR-ML and Rank-SVM we follow the same thresholding strategies as proposed by the authors. Specifically, for ML-kNN and IBLR we set the threshold to 0.5, and for Rank-SVM we use the linear least squares fit solution as the threshold, as proposed in [5].

4.2 Datasets

We used five datasets in our experiments, namely Emotions, Scene, Yeast, Reuters-21578 and Citeseer. These datasets form a representative sample across different fields and they vary in training-set size and feature-space size. All the datasets except Citeseer have been used in previous evaluations of MLC methods, with conventional train-test splits. We follow such conventions in order to make our results comparable to the previously published ones. Table 1 summarizes the dataset statistics.

Emotions [16] is a multi-label audio dataset, in which each instance is a song, indexed using 72 features such as amplitude, beats per minute, etc. The songs have been classified under six moods such as sad/lonely, relaxing/calm, happy/pleased, amazed/surprised, angry/aggressive and quiet/still.

Scene [1] is an image dataset. The images are indexed using a set of 294 features which decompose each image into smaller blocks and represent the color of each block. The images are classified based on the scenery (Beach, Sunset, etc.) they portray.

Footnote 3: We used the publicly available SVM-MAP software as the core algorithm.
Footnote 4: We used the publicly available SVMlight software package in our experiments.
Footnote 5: We thank the authors of ML-kNN for sharing their implementation of Rank-SVM.
Footnote 6: The Emotions, Scene and Yeast datasets were obtained from

Table 1. Dataset Characteristics: training size, testing size, number of categories, average number of categories per instance, and number of features for each of the Emotions, Scene, Yeast, Citeseer and Reuters-21578 datasets.

Yeast [5] is a biomedical dataset. Each instance is a gene, represented using a vector whose features are the micro-array expression levels under various conditions. The genes are classified into 14 different functional classes.

Citeseer is a set of research articles we collected from the Citeseer web site. Each article is indexed using the words in its abstract as the features, with a feature-set size of 14,601. We use the top level of 17 categories in the Citeseer classification hierarchy as the labels in this dataset, and randomly split 80% of the corpus into training and the rest as testing instances. The dataset will be made publicly available along with the publication of this paper.

Reuters-21578 is a benchmark dataset in text categorization evaluations. The instances are Reuters news articles from the period 1987 to 1991, labeled using 90 topical categories. We follow the same train-test split as in [2].

4.3 Metrics

We select two standard metrics for evaluating ranked lists, and two standard metrics for evaluating classification decisions.

Mean Average Precision (MAP) [18] is a popular metric in traditional IR evaluations for comparing ranked lists. It is defined as the average of the per-instance (or per-ranked-list) Average Precision (AP) over all test instances. Let D = {x_i}, i = 1, ..., n, be the test set of instances, L_i be the ranked list of categories for a specific instance x_i, r_i(c) be the rank of category c in list L_i, and R_i be the set of categories relevant to instance x_i. MAP is defined as:

MAP(D) = (1/n) sum_{i=1..n} AP(x_i)

AP(x_i) = (1/|R_i|) sum_{c in R_i} |{c' in R_i : r_i(c') <= r_i(c)}| / r_i(c)

Ranking Loss (RankLoss) is a popular metric for comparing MLC methods in ranking categories [14][3][5][25]. It measures the average number of times an irrelevant category is ranked above a relevant category in a ranked list:

RankLoss(x_i) = |{(c, c') in R_i x R̄_i : r_i(c) > r_i(c')}| / (|R_i| |R̄_i|)

where R̄_i is the complement of R_i.

Micro-averaged F1 is a conventional metric for evaluating classifiers in category assignments to test instances [10][22][23].
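The two rank-based metrics, AP/MAP and RankLoss, translate directly into code (a minimal sketch; `rel` is the binary relevance vector of one instance over categories, `scores` the corresponding system scores, and each instance is assumed to have at least one relevant and one irrelevant category):

```python
import numpy as np

def average_precision(rel, scores):
    # AP of one ranked list: mean of precision-at-rank over the ranks
    # where a relevant category appears.  Assumes >= 1 relevant category.
    order = np.argsort(-scores)
    hits, precisions = 0, []
    for rank, c in enumerate(order, start=1):
        if rel[c]:
            hits += 1
            precisions.append(hits / rank)
    return float(np.mean(precisions))

def rank_loss(rel, scores):
    # Fraction of (relevant, irrelevant) category pairs ranked in the
    # wrong order.  Assumes >= 1 relevant and >= 1 irrelevant category.
    pos, neg = scores[rel == 1], scores[rel == 0]
    bad = sum(1 for p in pos for q in neg if p < q)
    return bad / (len(pos) * len(neg))
```

MAP over a test set is then simply the mean of `average_precision` across instances.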
The system-made decisions on test set D with respect to a specific category C_i, i = 1, ..., m, can be divided into four groups: True Positives (TP_i), False Positives (FP_i), True Negatives (TN_i) and False Negatives (FN_i). The corresponding evaluation metrics are defined as:

Global precision: P = sum_i TP_i / sum_i (TP_i + FP_i)
Global recall: R = sum_i TP_i / sum_i (TP_i + FN_i)
Micro-averaged F1: F1 = 2PR / (P + R)

Macro-averaged F1 is also a conventional metric for evaluating classifiers in category assignments, defined as:

Category-specific precision: P_i = TP_i / (TP_i + FP_i)
Category-specific recall: R_i = TP_i / (TP_i + FN_i)
Macro-averaged F1: F1 = (1/m) sum_i 2 P_i R_i / (P_i + R_i)

Both micro-averaged F1 and macro-averaged F1 are informative for method comparison. The former gives the performance on each instance an equal weight in computing the average; the latter gives the performance on each category an equal weight in computing the average. We chose the evaluation measures so as to evaluate the performance of both the ranking algorithms and the thresholding strategies. MAP and RankLoss measure how well a system ranks the categories; Micro-F1 and Macro-F1 evaluate the effectiveness of the thresholding strategies for making classification decisions.

4.4 Experimental Setting Details

All parameters are tuned to optimize MAP, and the tuning is done through five-fold cross validation on the training set for each corpus. The tuned parameters include the number of nearest

neighbors in SVM-MAP-MLC, ML-kNN and IBLR-ML, and the regularization parameter in SVM-MAP-MLC, Binary-SVM and Rank-SVM. For the number of nearest neighbors we tried values from 10 to 100 in increments of 10; for the regularization parameter we tried 20 different values between 0 and 10. For IBLR, on Citeseer and Reuters-21578, the best choice for the number of nearest neighbors through cross validation was found to be 300 and 90 respectively. Feature selection was not performed on any of the datasets for any method. For term weighting in the Citeseer and Reuters documents, we used a conventional TF-IDF scheme (namely "ltc") [22]. On the Emotions dataset, each feature was rescaled to unit variance since we observed that the original values of the features are not comparably scaled.

5. RESULTS

The results of all the systems on the five datasets are summarized in Table 2. The relative ranks among the methods on each dataset under a specific metric are provided inside parentheses. We report the value of 1-RankLoss instead of RankLoss just to make the scores consistent with each other, i.e., higher values are better. The total rank of each system is provided at the bottom line of the table.

5.1 Comparative Analysis

Let us call each line in Table 2 a case, which corresponds to a particular dataset and a specific metric. SVM-MAP-MLC is the best in 18 out of the 20 cases while Binary-SVM is the best in the two remaining cases. That is, the proposed approach consistently outperformed the other methods in most cases.

Comparing Rank-SVM with Binary-SVM, both are large-margin methods but the former outperformed the latter in 12 out of the 20 cases. These results are consistent with the previously reported evaluation on the Yeast dataset [5], showing some success of Rank-SVM by reinforcing partial-order preferences among categories. Comparing Rank-SVM with SVM-MAP-MLC, on the other hand, the latter outperformed the former in 19 out of the 20 cases although both methods have partial-order preferences in their objective functions for optimization.
Table 2. Results Summary: each method is evaluated on five datasets using four metrics; the numbers in parentheses are the ranks of the systems on the given dataset and metric (1 = best).

Dataset        Metric       SVM-MAP-MLC  ML-kNN  Rank-SVM  Binary-SVM  IBLR-ML
Emotions       MAP          (1)          (3)     (5)       (4)         (2)
Emotions       1-RankLoss   (1)          (3)     (4)       (5)         (2)
Emotions       Micro-F1     (1)          (3)     (4)       (5)         (2)
Emotions       Macro-F1     (1)          (3)     (4)       (5)         (2)
Scene          MAP          (1)          (5)     (4)       (3)         (2)
Scene          1-RankLoss   (1)          (5)     (2)       (3)         (4)
Scene          Micro-F1     (1)          (3)     (5)       (4)         (2)
Scene          Macro-F1     (1)          (3)     (5)       (4)         (2)
Yeast          MAP          (1)          (3)     (4)       (5)         (2)
Yeast          1-RankLoss   (1)          (4)     (3)       (5)         (2)
Yeast          Micro-F1     (1)          (5)     (2)       (4)         (3)
Yeast          Macro-F1     (1)          (4)     (3)       (5)         (2)
Citeseer       MAP          (1)          (4)     (2)       (3)         (5)
Citeseer       1-RankLoss   (1)          (3)     (2)       (5)         (4)
Citeseer       Micro-F1     (1)          (4)     (2)       (3)         (5)
Citeseer       Macro-F1     (1)          (3)     (2)       (4)         (5)
Reuters-21578  MAP          (2)          (4)     (3)       (1)         (5)
Reuters-21578  1-RankLoss   (1)          (4)     (3)       (2)         (5)
Reuters-21578  Micro-F1     (3)          (4)     (2)       (1)         (5)
Reuters-21578  Macro-F1     (1)          (5)     (3)       (2)         (4)
Rank Total                  23           75      64        73          65

Table 3. p-values in signed-rank tests comparing SVM-MAP-MLC with the other methods ("*" denotes lower performance of SVM-MAP-MLC); columns: ML-kNN, Rank-SVM, Binary-SVM, IBLR-ML; rows: the five datasets.

The advanced performance of SVM-MAP-MLC, evidently, comes from the use of the meta-level features instead of the conventional features as in Rank-SVM.

Comparing SVM-MAP-MLC, ML-kNN and IBLR-ML, these methods have one property in common: they are either fully instance-based or partially instance-based, leveraging kNN-based features. IBLR-ML outperformed ML-kNN in 13 out of the 20 cases in our experiments; this is more or less consistent with the previous report by [3] in terms of the relative performance of the two methods. Nevertheless, both IBLR-ML and ML-kNN underperformed SVM-MAP-MLC in all 20 cases, showing the advantage of using the meta-level features in the learning-to-rank framework with SVM-MAP.

Comparing Rank-SVM with ML-kNN, the former outperformed the latter in 13 out of the 20 cases. The relative performance of these two methods differs from the previously reported evaluation in [25], where ML-kNN outperformed Rank-SVM on average. In order to clarify this issue, we compared the performance of Rank-SVM with different values of its regularization parameter, which controls the balance between the training-set loss and the model complexity. We found that Rank-SVM performed suboptimally with the default parameter setting, and performed better when this parameter was tuned through cross validation. Our results for Rank-SVM are based on the properly tuned parameters, and this should explain why Rank-SVM performed stronger than ML-kNN in our experiments.

Comparing Rank-SVM with IBLR-ML, each method outperformed the other in 10 out of the 20 cases. Thus the two methods have comparable performance; both are strong baselines for method comparison in MLC.

6. SIGNIFICANCE TESTS

We used the Wilcoxon signed-rank test to compare the performance of each method with that of SVM-MAP-MLC.
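Such a one-sided signed-rank comparison over per-instance AP scores can be sketched as follows; this is a normal-approximation stand-in (not necessarily the exact procedure behind the paper's numbers), reasonable for large test sets:

```python
import math

def wilcoxon_signed_rank(a, b):
    # One-sided signed-rank test (H1: scores in a tend to exceed b),
    # normal approximation; a and b are paired per-instance AP scores.
    d = [x - y for x, y in zip(a, b) if x != y]   # drop zero differences
    n = len(d)
    ranked = sorted(range(n), key=lambda i: abs(d[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:                                  # average ranks over ties
        j = i
        while j + 1 < n and abs(d[ranked[j + 1]]) == abs(d[ranked[i]]):
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for t in range(i, j + 1):
            ranks[ranked[t]] = avg
        i = j + 1
    w_pos = sum(r for r, v in zip(ranks, d) if v > 0)   # W+ statistic
    mu = n * (n + 1) / 4.0
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24.0)
    z = (w_pos - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))    # small p-value: a >> b
```

A small p-value rejects the null hypothesis that the two systems are equally good in favor of the first being better.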
Sgned-rank s a non-parametr statstal test for parwse omparson of methods, and a better alternatve to the pared t-test when the performane sores are not normally dstrbuted. Due to the spae lmt of the paper, we only present the tests based on the MAP sores of the systems. Eah test nstane s treated as a random event, and the average preson sores of systemgenerated ranked lsts over all test nstanes are used to ompare eah par of systems. The null hypothess s that the system beng ompared wth SVM-MAP-MLC s equally good; the alternatve s that SVM-MAP-MLC s better. The p-values are presented n Table 3 where the null hypothess s reeted wth strong evdene n most ases. That s, the performane dfferene s statstally sgnfant n 7 out of the 20 tests f usng % p-value as the threshold. We dd not use ANOVA tests for mult-set omparson of systems beause suh tests have a normal-dstrbuton assumpton about the data whh s napproprate for the performane sores we use for system omparson. The Fredman test has been reently advoated for omparng lassfers on multple datasets [4] [6], whh does not have the normal assumpton. However, t treats eah dataset as a random event, and requres a relatvely large number of datasets for meanngful testng. It s not reommended to use the Fredman test when the number of datasets s 0 or less Experments wth Feature Subsets In order to analyze the usefulness of dfferent types of meta-level features (lnks), we onduted experments wth the followng ombnatons: usng the sngle lnks n L -norm only, usng the sngle lnks n L2 -norm only, usng the sngle lnks n osne smlarty, and usng all the lnks -- nludng those n L -norm, L2 -norm, osne smlarty and the two average lnks per ategory. Fgure 2 ompares the performane of SVM-MAP-MLC under these ondtons. The dfferent types of lnks have omplementary effets: the sngle lnks n L -norm are more useful n the datasets (Emotons, Sene and Yeast) whose feature- Fgure 2. 
SVM-MAP-MLC wth dfferent feature sets (Performane n MAP, -RankLoss, Mro-F, Maro-F) 8 Ths s aordng to personal ommunaton wth the author of [4].

…set sizes are relatively smaller than they are on the datasets (Citeseer and Reuters-21578) whose feature-set sizes are large. On the other hand, cosine-similarity based single links have a different performance pattern; single links in L2-norm have comparable performance across the datasets. Using all the features together performs the best, showing that SVM-MAP-MLC is able to assign appropriate weights to different features, and improve the robustness of its predictions by making combined use of the different feature types.

7. CONCLUDING REMARKS

In this paper we proposed a new approach for learning to rank categories in multi-label classification. By introducing meta-level features that effectively characterize the one-to-many relationship from instances to categories in MLC, and by formulating the category ranking problem as a standard ad-hoc retrieval problem, our framework allows a broad range of learning-to-rank retrieval algorithms to be deployed for MLC optimization with respect to various performance metrics. Using SVM-MAP-MLC as a specific instantiation of this framework, and with controlled experiments on multiple benchmark datasets, the strength of the proposed approach is strongly evident, as it significantly outperformed all the state-of-the-art methods (Rank-SVM, ML-kNN and IBLR-ML) evaluated in our experiments. We hope this study provides useful insights into how to enhance the performance of MLC methods by improving the representation schemes for instances, categories and their relationships, and by creatively leveraging dimensionality reduction. A line of future research would be to explore our framework with other learning-to-rank algorithms, using different dimensionality reduction techniques (such as SVD or LDA), and for different optimization metrics (such as NDCG and other types of loss functions).

ACKNOWLEDGEMENTS

This work is supported, in part, by the National Science Foundation (NSF) under grant IIS_. Any opinions, findings, conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the sponsors.

REFERENCES

[1] M.R. Boutell, J. Luo, X. Shen and C.M. Brown. Learning multi-label scene classification. Pattern Recognition, 2004.
[2] R.H. Creecy, B.M. Masand, S.J. Smith and D.L. Waltz. Trading MIPS and memory for knowledge engineering. Communications of the ACM, 1992.
[3] W. Cheng and E. Hüllermeier. Combining instance-based learning and logistic regression for multilabel classification. Journal of Machine Learning Research, 2009.
[4] J. Demsar. Statistical comparisons of classifiers over multiple data sets. Journal of Machine Learning Research, 2006, pages 1-30.
[5] A. Elisseeff and J. Weston. A kernel method for multi-labelled classification. Advances in Neural Information Processing Systems, 2002.
[6] S. García and F. Herrera. An extension on statistical comparisons of classifiers over multiple data sets for all pairwise comparisons. Journal of Machine Learning Research, 2008.
[7] M. Hall, E. Frank, G. Holmes, B. Pfahringer, P. Reutemann and I.H. Witten. The WEKA data mining software: an update. SIGKDD Explorations, Volume 11.
[8] T. Joachims. A support vector method for multivariate performance measures. International Conference on Machine Learning, 2005.
[9] T. Joachims. Making large-scale SVM learning practical. Advances in Kernel Methods - Support Vector Learning, MIT Press, 1999.
[10] D.D. Lewis, R.E. Schapire, J.P. Callan and R. Papka. Training algorithms for linear text classifiers. ACM SIGIR, 1996.
[11] F. Li and Y. Yang. A loss function analysis for classification methods in text categorization. International Conference on Machine Learning, 2003.
[12] P. Li, C. Burges, Q. Wu, J.C. Platt, D. Koller, Y. Singer and S. Roweis. McRank: learning to rank using multiple classification and gradient boosting. Advances in Neural Information Processing Systems.
[13] S. Har-Peled, D. Roth and D. Zimak. Constraint classification: a new approach to multiclass classification and ranking. Advances in Neural Information Processing Systems, 2002.
[14] R.E. Schapire and Y. Singer. BoosTexter: a boosting-based system for text categorization. Journal of Machine Learning Research, 2000.
[15] R.E. Schapire and Y. Singer. Improved boosting algorithms using confidence-rated predictions. Journal of Machine Learning Research, 1999.
[16] K. Trohidis, G. Tsoumakas, G. Kalliris and I. Vlahavas. Multilabel classification of music into emotions. International Conference on Music Information Retrieval.
[17] V. Vapnik. The Nature of Statistical Learning Theory. Springer-Verlag, New York.
[18] E. Voorhees. Overview of TREC 2002. NIST Special Publication SP.
[19] J. Xu and H. Li. AdaRank: a boosting algorithm for information retrieval. ACM SIGIR, 2007.
[20] Y. Yang. Expert network: effective and efficient learning from human decisions in text categorization and retrieval. ACM SIGIR, 1994.
[21] Y. Yang. A study of thresholding strategies for text categorization. ACM SIGIR, 2001.
[22] Y. Yang. An evaluation of statistical approaches for text classification. Information Retrieval, 1999, Vol. 1, No. 1/2.
[23] Y. Yang and J.O. Pedersen. A comparative study of feature selection in text categorization. International Conference on Machine Learning, 1997.
[24] Y. Yue, T. Finley, F. Radlinski and T. Joachims. A support vector method for optimizing average precision. ACM SIGIR.
[25] M.L. Zhang and Z.H. Zhou. ML-kNN: a lazy learning approach to multi-label learning. Pattern Recognition, 2007.
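The framework summarized in the concluding remarks, mapping each (instance, category) pair to a small set of link-based meta-level features and then scoring categories with a learned ranking function, can be sketched in a few lines. The two features below (cosine similarity to the category centroid, and mean similarity to the k nearest training instances within the category) and the uniform linear weights are illustrative assumptions for this sketch, not the paper's exact feature set or its trained SVM-MAP model:

```python
import numpy as np

def meta_level_features(x, cat_docs, k=3):
    """Map an (instance, category) pair to a small meta-level feature vector.

    Illustrative link-based features (assumed, not the paper's exact set):
      1. cosine similarity between x and the category centroid,
      2. mean cosine similarity between x and its k most similar
         training instances inside the category.
    """
    def cos(a, b):
        na, nb = np.linalg.norm(a), np.linalg.norm(b)
        return float(a @ b / (na * nb)) if na and nb else 0.0

    centroid = cat_docs.mean(axis=0)
    sims = sorted((cos(x, d) for d in cat_docs), reverse=True)
    return np.array([cos(x, centroid), float(np.mean(sims[:k]))])

def rank_categories(x, categories, w):
    """Score each category with a linear model over meta-level features
    (standing in for a trained learning-to-rank model such as SVM-MAP)
    and return category names sorted by decreasing score."""
    scores = {name: float(w @ meta_level_features(x, docs))
              for name, docs in categories.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Toy data: two categories, each with three training instances in 2-D.
categories = {
    "sports":   np.array([[1.0, 0.1], [0.9, 0.2], [1.1, 0.0]]),
    "politics": np.array([[0.1, 1.0], [0.2, 0.9], [0.0, 1.1]]),
}
w = np.array([1.0, 1.0])   # uniform weights, for illustration only
x = np.array([1.0, 0.15])  # a test instance close to the "sports" cluster
print(rank_categories(x, categories, w))  # → ['sports', 'politics']
```

Because every category is described by the same few meta-level features regardless of the original vocabulary size, the ranking model operates in a low-dimensional space, which is the dimensionality reduction the conclusion refers to.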


More information

Evaluation of Segmentation in Magnetic Resonance Images Using k-means and Fuzzy c-means Clustering Algorithms

Evaluation of Segmentation in Magnetic Resonance Images Using k-means and Fuzzy c-means Clustering Algorithms ELEKTROTEHIŠKI VESTIK 79(3): 129-134, 2011 EGLISH EDITIO Evaluaton of Segmentaton n Magnet Resonane Images Usng k-means and Fuzzy -Means Clusterng Algorthms Tomaž Fnkšt Unverza v Lublan, Fakulteta za stronštvo,

More information

MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION

MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION Paulo Quntlano 1 & Antono Santa-Rosa 1 Federal Polce Department, Brasla, Brazl. E-mals: quntlano.pqs@dpf.gov.br and

More information

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr)

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr) Helsnk Unversty Of Technology, Systems Analyss Laboratory Mat-2.08 Independent research projects n appled mathematcs (3 cr) "! #$&% Antt Laukkanen 506 R ajlaukka@cc.hut.f 2 Introducton...3 2 Multattrbute

More information

Announcements. Supervised Learning

Announcements. Supervised Learning Announcements See Chapter 5 of Duda, Hart, and Stork. Tutoral by Burge lnked to on web page. Supervsed Learnng Classfcaton wth labeled eamples. Images vectors n hgh-d space. Supervsed Learnng Labeled eamples

More information

A Modified Median Filter for the Removal of Impulse Noise Based on the Support Vector Machines

A Modified Median Filter for the Removal of Impulse Noise Based on the Support Vector Machines A Modfed Medan Flter for the Removal of Impulse Nose Based on the Support Vector Machnes H. GOMEZ-MORENO, S. MALDONADO-BASCON, F. LOPEZ-FERRERAS, M. UTRILLA- MANSO AND P. GIL-JIMENEZ Departamento de Teoría

More information

Face Recognition University at Buffalo CSE666 Lecture Slides Resources:

Face Recognition University at Buffalo CSE666 Lecture Slides Resources: Face Recognton Unversty at Buffalo CSE666 Lecture Sldes Resources: http://www.face-rec.org/algorthms/ Overvew of face recognton algorthms Correlaton - Pxel based correspondence between two face mages Structural

More information

An Adaptive Filter Based on Wavelet Packet Decomposition in Motor Imagery Classification

An Adaptive Filter Based on Wavelet Packet Decomposition in Motor Imagery Classification An Adaptve Flter Based on Wavelet Paket Deomposton n Motor Imagery Classfaton J. Payat, R. Mt, T. Chusak, and N. Sugno Abstrat Bran-Computer Interfae (BCI) s a system that translates bran waves nto eletral

More information

UB at GeoCLEF Department of Geography Abstract

UB at GeoCLEF Department of Geography   Abstract UB at GeoCLEF 2006 Mguel E. Ruz (1), Stuart Shapro (2), June Abbas (1), Slva B. Southwck (1) and Davd Mark (3) State Unversty of New York at Buffalo (1) Department of Lbrary and Informaton Studes (2) Department

More information

CHAPTER 2 PROPOSED IMPROVED PARTICLE SWARM OPTIMIZATION

CHAPTER 2 PROPOSED IMPROVED PARTICLE SWARM OPTIMIZATION 24 CHAPTER 2 PROPOSED IMPROVED PARTICLE SWARM OPTIMIZATION The present chapter proposes an IPSO approach for multprocessor task schedulng problem wth two classfcatons, namely, statc ndependent tasks and

More information

Unsupervised Learning

Unsupervised Learning Pattern Recognton Lecture 8 Outlne Introducton Unsupervsed Learnng Parametrc VS Non-Parametrc Approach Mxture of Denstes Maxmum-Lkelhood Estmates Clusterng Prof. Danel Yeung School of Computer Scence and

More information

Range images. Range image registration. Examples of sampling patterns. Range images and range surfaces

Range images. Range image registration. Examples of sampling patterns. Range images and range surfaces Range mages For many structured lght scanners, the range data forms a hghly regular pattern known as a range mage. he samplng pattern s determned by the specfc scanner. Range mage regstraton 1 Examples

More information

Unsupervised Learning and Clustering

Unsupervised Learning and Clustering Unsupervsed Learnng and Clusterng Supervsed vs. Unsupervsed Learnng Up to now we consdered supervsed learnng scenaro, where we are gven 1. samples 1,, n 2. class labels for all samples 1,, n Ths s also

More information

Clustering incomplete data using kernel-based fuzzy c-means algorithm

Clustering incomplete data using kernel-based fuzzy c-means algorithm Clusterng noplete data usng ernel-based fuzzy -eans algorth Dao-Qang Zhang *, Song-Can Chen Departent of Coputer Sene and Engneerng, Nanjng Unversty of Aeronauts and Astronauts, Nanjng, 210016, People

More information

Multiscale Heterogeneous Modeling with Surfacelets

Multiscale Heterogeneous Modeling with Surfacelets 759 Multsale Heterogeneous Modelng wth Surfaelets Yan Wang 1 and Davd W. Rosen 2 1 Georga Insttute of Tehnology, yan.wang@me.gateh.edu 2 Georga Insttute of Tehnology, davd.rosen@me.gateh.edu ABSTRACT Computatonal

More information

A Novel Dynamic and Scalable Caching Algorithm of Proxy Server for Multimedia Objects

A Novel Dynamic and Scalable Caching Algorithm of Proxy Server for Multimedia Objects Journal of VLSI Sgnal Proessng 2007 * 2007 Sprnger Sene + Busness Meda, LLC. Manufatured n The Unted States. DOI: 10.1007/s11265-006-0024-7 A Novel Dynam and Salable Cahng Algorthm of Proxy Server for

More information