Learning General Gaussian Kernels by Optimizing Kernel Polarization

Chinese Journal of Electronics, Vol.18, No.2, Apr. 2009

WANG Tinghua¹, HUANG Houkuan¹, TIAN Shengfeng¹ and DENG Dayong²
(1. School of Computer and Information Technology, Beijing Jiaotong University, Beijing, China)
(2. School of Mathematics and Information Engineering, Zhejiang Normal University, Jinhua, China)

Abstract   The problem of model selection for Support vector machines (SVM) with general Gaussian kernels is considered. Unlike conventional single-scale Gaussian kernels, where all basis functions share a common kernel width, general Gaussian kernels apply a linear transformation to the input space so that not only the scaling but also the rotation is adapted. We propose a gradient-based method for learning optimal general Gaussian kernels by optimizing kernel polarization. The method can find a more powerful kernel for a given classification problem without designing any classifier. Experiments on both synthetic and real data sets demonstrate that tuning the scaling and rotation of Gaussian kernels with our method yields better generalization performance of support vector machines.

Key words   General Gaussian kernels, Kernel polarization, Support vector machines (SVM), Model selection.

I. Introduction

Kernel methods such as Support vector machines (SVM) [1,2] have been successfully applied to a wide range of data analysis problems. The key to this success is that kernel methods can be modularized into two parts: the first maps the data into a feature space, a framework powerful enough to cover many different data types; the second runs a linear algorithm in that feature space, which is efficient and comes with theoretical guarantees [2]. The methods take advantage of a problem-specific kernel function that computes the inner product of input data points in the feature space. Choosing the appropriate kernel function, and thereby the appropriate feature space, is therefore the crucial step in handling a learning task with kernel machines.
This problem is well known as the model selection issue for kernel methods. Often a parameterized family of kernel functions is considered, and the problem reduces to finding an appropriate parameter vector for the given task. Perhaps the most elaborate systematic technique for choosing multiple parameters of a given kernel is the family of gradient-based methods [3–6]. These algorithms iterate the following procedure: the SVM is trained with the current parameter vector to provide the Lagrange multipliers, the gradient of some generalization error bound, such as the radius-margin bound, with respect to the parameters is calculated, and the parameters are updated accordingly. However, this approach has a significant drawback: each iteration requires training the learning machine and solving an additional quadratic program to compute the radius of the smallest ball enclosing the training data in feature space, which may incur high computational expense for some problems. In this paper, we propose a related method for parameter selection that does not suffer from this drawback. In Ref.[7], Baram proposed a new kernel optimality criterion, namely kernel polarization, for kernel adaptation. Drawn from physics, kernel polarization means driving points with different class labels to opposite locations in the feature space defined by the kernel geometry. Independently derived, kernel polarization simplifies the kernel-target alignment criterion [8–10] by ridding the latter of its denominator, making the optimization problem considerably easier. The significant property of optimizing kernel polarization for model selection is that it is independent of the actual learning machine used; it makes use of the information in the complete training data and can be computed efficiently. We adopt the kernel polarization criterion to learn optimal general Gaussian kernels, where not only the scaling but also the rotation is adapted. The rest of this article is organized as follows. In Section II, we give a brief introduction to SVM.
In Section III we first review the kernel polarization criterion and then investigate its geometric significance further. The proposed technique for the adaptation of general Gaussian kernels is presented in Section IV. Experimental results are reported in Section V, and finally we draw conclusions and discuss future work in Section VI.

(Manuscript Received Jan. 2008; Accepted Oct. … This work is supported by the National Grand Fundamental Research 973 Program of China (No.2007CB307100, No.2007CB…) and the Specialized Research Foundation of the Doctoral Program of Higher Education of China (No.…).)

II. Support Vector Machines

We consider L1-norm soft margin support vector machines for binary classification. The fundamental concepts of SVM were developed by Vapnik [1]. Let (x_i, y_i), 1 ≤ i ≤ l, be the training samples, where y_i ∈ {+1, −1} is the label associated with input pattern x_i ∈ X ⊆ R^n. The main idea of SVM is to map the input patterns into a feature space F and then separate the transformed data linearly in F. The transformation φ : X → F is carried out implicitly by a kernel function k : X × X → R, which computes the inner product in the feature space efficiently, that is, k(x_i, x_j) = ⟨φ(x_i), φ(x_j)⟩. Patterns are classified by a decision function f of the form

    f(x) = sgn( Σ_{i=1}^l a_i y_i k(x_i, x) + b )                                   (1)

The coefficients a_i are computed by maximizing the dual optimization problem

    max_a   Σ_{i=1}^l a_i − (1/2) Σ_{i,j=1}^l y_i y_j a_i a_j k(x_i, x_j)
    s.t.    Σ_{i=1}^l a_i y_i = 0  and  0 ≤ a_i ≤ C,  i = 1, 2, …, l                (2)

where C is a regularization parameter controlling the trade-off between maximizing the margin and minimizing the L1-norm of the margin slack vector of the training data. The bias b in Eq.(1) can then be determined from the solution a. The patterns x_i with a_i > 0 are called support vectors. The kernel appearing in the problem has to be a positive definite function, i.e., it must satisfy Mercer's condition. Mercer's condition tells us whether or not a kernel is actually an inner product in some space, and it ensures that the solution of Eq.(2) is a global optimum. Well-known kernels include the linear kernel, the polynomial kernel, the Gaussian kernel, the sigmoid kernel and so on. In the following we consider the general Gaussian kernels

    k_A(x_i, x_j) = exp(−(x_i − x_j)^T A (x_i − x_j))                               (3)

where A is a Positive semidefinite (PSD) n × n matrix. This is an extension of the standard single-scale Gaussian kernel

    k(x_i, x_j) = exp(−‖x_i − x_j‖² / (2σ²))                                        (4)

where σ is the kernel width parameter.

III. Kernel Polarization

Given a training data set (x_i, y_i), 1 ≤ i ≤ l, where x_i ∈ X and y_i ∈ {+1, −1}, the kernel function k : X × X → R defines a symmetric positive definite kernel matrix (Gram matrix) K ∈ R^{l×l} by K_ij = k(x_i, x_j). Let y = (y_1, …, y_l)^T; the ideal kernel matrix would be yy^T, which perfectly suits the training data.
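As a concrete illustration, here is a minimal NumPy sketch of the general Gaussian kernel of Eq.(3), its single-scale special case A = I/(2σ²) of Eq.(4), and the Frobenius-inner-product score ⟨K, yy^T⟩ suggested by the ideal kernel matrix above. The function names and toy data are ours, not from the paper:

```python
import numpy as np

def general_gaussian_kernel(X1, X2, A):
    """Gram matrix of k_A(x, x') = exp(-(x - x')^T A (x - x')) for a PSD matrix A (Eq.(3))."""
    diff = X1[:, None, :] - X2[None, :, :]            # all pairwise differences
    quad = np.einsum('ijk,kl,ijl->ij', diff, A, diff) # quadratic form per pair
    return np.exp(-quad)

def kernel_polarization(K, y):
    """phi = <K, y y^T>_F = sum_ij y_i y_j K_ij; O(l^2) given the Gram matrix,
    with no classifier training required."""
    return float(y @ K @ y)

# The standard single-scale kernel of Eq.(4) is the special case A = I / (2 sigma^2).
X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0]])
y = np.array([1.0, 1.0, -1.0])
sigma = 1.0
A = np.eye(2) / (2.0 * sigma**2)
K = general_gaussian_kernel(X, X, A)   # symmetric, with K[i, i] = 1
phi = kernel_polarization(K, y)        # larger when same-class pairs are more similar
```

Here the score grows with the same-class similarities and shrinks with the cross-class ones, matching the intuition of driving the two classes apart in feature space.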
The kernel polarization of K is defined as the Frobenius inner product

    ϕ = ⟨K, yy^T⟩ = Σ_{i,j=1}^l y_i y_j k(x_i, x_j)                                 (5)

It can be rewritten as

    ϕ = Σ_{y_i = y_j} k(x_i, x_j) − Σ_{y_i ≠ y_j} k(x_i, x_j)                       (6)

Kernel polarization measures the similarity of the kernel to yy^T on the training data. Obviously, ϕ increases when the similarity represented by the kernel is large for input patterns of the same class and small for patterns from different classes; this is the intuitive reason for preferring a kernel with high kernel polarization. The ease with which kernel polarization can be calculated, in O(l²) time using only the training data and prior to any computationally intensive SVM training, makes it an attractive quality criterion for kernel selection. Employed with SVM, kernel polarization for the Gaussian kernel with a single scale parameter was found to yield about the same classification results as exhaustive parameter search. Furthermore, it has been shown that complete kernel polarization yields consistent classification by kernel-sum classifiers, which motivates kernel polarization from a theoretical viewpoint [7].

Now let us investigate further the geometric significance of kernel polarization with general Gaussian kernels by reformulating it in a pairwise manner in feature space. Suppose y_1 = … = y_{l+} = 1 and y_{l+ + 1} = … = y_{l+ + l−} = −1, where l+ (l−) is the number of samples whose class label is +1 (−1) and l+ + l− = l. Note that k_A(x_i, x_i) = 1 for all x_i ∈ X and

    ‖φ(x_i) − φ(x_j)‖² = k_A(x_i, x_i) − 2 k_A(x_i, x_j) + k_A(x_j, x_j)
                       = 2 − 2 k_A(x_i, x_j)                                        (7)

Substituting Eq.(7) into Eq.(5), we get

    ϕ = l² − 4 l+ l− + (1/2) Σ_{i,j=1}^l B_ij ‖φ(x_i) − φ(x_j)‖²                    (8)

where

    B_ij = −1 if y_i = y_j,   +1 if y_i ≠ y_j                                       (9)

This representation gives a clear interpretation of high kernel polarization: it tries to keep within-class data pairs close (since B_ij is negative when y_i = y_j) and between-class data pairs apart (since B_ij is positive when y_i ≠ y_j). Generally speaking, a high ϕ leads one to expect good classification performance, as elements of the same class come close together while those of different classes move apart in the feature space.

IV. Implementation for Kernel Adaptation

By using the map P ↦ P P^T = A, the general Gaussian kernels can be equivalently reformulated as

    k_P(x_i, x_j) = exp(−(x_i − x_j)^T P P^T (x_i − x_j))                           (10)

where P is an arbitrary n × n real-valued matrix, unlike the work of Glasmachers and Igel [11], in which P was restricted to a positive definite symmetric n × n matrix. This extension of P reduces our optimization problem to an unconstrained one. Three different kernel parameterizations are considered:

(1) P = λE, where E is the unit matrix and λ is the only parameter to be adapted. This case is just the standard Gaussian kernel with a single width parameter.

(2) P is a diagonal matrix. In this case independent scalings of the axes are adapted, which corresponds to learning different weights for the different axes (features).

(3) P is an arbitrary matrix. In this case both the scaling and the rotation of the input space are adapted, which corresponds to learning possible correlations among features.

Adapting P changes three properties of the kernel: the size, controlled by the determinant of P and defined as the smallest volume in which a certain amount of the kernel is concentrated; the shape, determined by the eigenvalues of P; and the orientation, when non-diagonal matrices P are allowed. The three cases of P, named Single, Diagonal and Full respectively, are listed in Table 1.

Table 1. Three cases of kernel parameterizations
    Name      Constraint    #Parameters   Impact on kernel
    Single    P = λE        1             size
    Diagonal  P diagonal    n             size & shape
    Full      none          n²            size, shape & orientation

We propose a gradient ascent algorithm to learn the matrix P by maximizing the kernel polarization. Given a training set (x_i, y_i), 1 ≤ i ≤ l, where x_i ∈ X and y_i ∈ {+1, −1}, the kernel polarization with general Gaussian kernels can be written as

    ϕ(P) = Σ_{i,j=1}^l k_P(x_i, x_j) y_i y_j                                        (11)

hence the strategy for learning the matrix P is to choose P satisfying

    P* = argmax_P ϕ(P)                                                              (12)

The derivative of ϕ(P) with respect to P is

    ∂ϕ(P)/∂P = Σ_{i,j=1}^l y_i y_j ∂k_P(x_i, x_j)/∂P                                (13)

The partial derivative of k_P(x_i, x_j) is the derivative of a scalar function with respect to the matrix P; as such, it is a matrix whose entries are the derivatives with respect to the corresponding entries of P. First, we give the following lemma.

Lemma 1   Let U_ij = x_i^T P P^T x_j. The first derivative of U_ij with respect to P is

    ∂U_ij/∂P = (x_i x_j^T + x_j x_i^T) P                                            (14)

Proof   Let x_i = (x_i^(1), x_i^(2), …, x_i^(n))^T, x_j = (x_j^(1), x_j^(2), …, x_j^(n))^T and P = (p_rs)_{n×n}. We get

    U_ij = x_i^T P P^T x_j = Σ_{r=1}^n Σ_{s=1}^n Σ_{t=1}^n x_i^(r) x_j^(t) p_rs p_ts

so the first derivative of U_ij with respect to p_rs can be expressed as

    ∂U_ij/∂p_rs = x_i^(r) Σ_{t=1}^n x_j^(t) p_ts + x_j^(r) Σ_{t=1}^n x_i^(t) p_ts

This expression is exactly the (r, s) entry of Eq.(14).
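Lemma 1 is all that is needed to assemble the gradient of the polarization objective and maximize it by gradient ascent, as developed in the rest of this section. A minimal NumPy sketch follows; the function names, fixed step size and toy data are ours and purely illustrative:

```python
import numpy as np

def kernel_matrix_P(X, P):
    """Gram matrix of k_P(x_i, x_j) = exp(-(x_i - x_j)^T P P^T (x_i - x_j)) (Eq.(10))."""
    Z = X @ P                        # row i holds x_i^T P
    sq = np.sum(Z * Z, axis=1)       # U_ii = x_i^T P P^T x_i
    U = Z @ Z.T                      # U_ij = x_i^T P P^T x_j
    return np.exp(-(sq[:, None] - 2.0 * U + sq[None, :]))

def polarization_gradient(X, y, P):
    """phi(P) = sum_ij y_i y_j k_P(x_i, x_j) and its gradient with respect to P.
    Each pair contributes y_i y_j k_P(x_i, x_j) (2 dU_ij - dU_ii - dU_jj)/dP,
    where dU_ij/dP = (x_i x_j^T + x_j x_i^T) P by Lemma 1."""
    K = kernel_matrix_P(X, P)
    phi = float(y @ K @ y)
    grad = np.zeros_like(P)
    for i in range(X.shape[0]):
        for j in range(X.shape[0]):
            dU_ij = (np.outer(X[i], X[j]) + np.outer(X[j], X[i])) @ P
            dU_ii = 2.0 * np.outer(X[i], X[i]) @ P
            dU_jj = 2.0 * np.outer(X[j], X[j]) @ P
            grad += y[i] * y[j] * K[i, j] * (2.0 * dU_ij - dU_ii - dU_jj)
    return phi, grad

def learn_P(X, y, eta=0.01, max_iter=500, tol=1e-8):
    """Gradient ascent P <- P + eta * grad phi(P), started from the unit matrix
    so that P P^T begins positive definite; stops once the improvement of the
    kernel polarization falls below tol."""
    P = np.eye(X.shape[1])
    phi, grad = polarization_gradient(X, y, P)
    for _ in range(max_iter):
        P_new = P + eta * grad
        phi_new, grad_new = polarization_gradient(X, y, P_new)
        if phi_new - phi < tol:      # no sufficient improvement: stop
            break
        P, phi, grad = P_new, phi_new, grad_new
    return P, phi

# Tiny two-class illustration: the polarization never decreases during the ascent.
X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
y = np.array([1.0, 1.0, -1.0, -1.0])
phi0, _ = polarization_gradient(X, y, np.eye(2))
P_opt, phi_opt = learn_P(X, y)       # phi_opt >= phi0 by construction
```

Only improving steps are accepted, so the returned polarization is at least the initial one; with its fixed step size this is a sketch of the idea rather than a production optimizer.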
Second, we rewrite Eq.(10) as

    k_P(x_i, x_j) = exp(−(x_i^T P P^T x_i − 2 x_i^T P P^T x_j + x_j^T P P^T x_j))   (15)

According to Eqs.(13), (14) and (15), we have

    ∂ϕ(P)/∂P = Σ_{i,j=1}^l y_i y_j k_P(x_i, x_j) (2 ∂U_ij/∂P − ∂U_ii/∂P − ∂U_jj/∂P) (16)

Therefore, the gradient of ϕ(P) with respect to P is obtained. In the special case of a diagonal P, i.e. P = diag(p^(1), p^(2), …, p^(n)), let x_i = (x_i^(1), x_i^(2), …, x_i^(n))^T; then

    k_P(x_i, x_j) = exp(−Σ_{k=1}^n ((x_i^(k))² − 2 x_i^(k) x_j^(k) + (x_j^(k))²)(p^(k))²)   (17)

Thus the gradient with respect to p^(k) is

    ∂k_P(x_i, x_j)/∂p^(k) = −2 p^(k) k_P(x_i, x_j) ((x_i^(k))² − 2 x_i^(k) x_j^(k) + (x_j^(k))²)   (18)

Finally, the procedure for learning the matrix P is summarized as follows.

Procedure 1
(1) Initialize P to a random matrix;
(2) Update P such that ϕ(P) is increased, using the gradient update rule P ← P + η ∇_P ϕ(P);
(3) Go to step (2), or stop if a given stopping criterion is met;
(4) End.

Here η > 0 is the gradient step and ∇_P ϕ(P) is the gradient of ϕ(P) with respect to P. To keep the optimization problem simple, no constraint is placed on P; note that the matrix P P^T appearing in Eq.(15) is automatically positive semidefinite for any real P. We initialize P to the unit matrix I, so that P P^T starts out positive definite.

We now consider the computational complexity of the proposed algorithm. For the case where P is an arbitrary matrix (Full), the time complexity of Eq.(14) is O(n³) and that of Eq.(15) is O(n²); hence the time complexity of Eq.(16) is O(l²n³). Therefore, the time complexity of Procedure 1 is O(kl²n³), where k denotes the number of iterations of the procedure. Similarly, the time complexity of adapting a diagonal matrix P is O(kl²n²). The overall complexity of our algorithm is thus somewhat high, especially in the arbitrary matrix case. However, the computational burden lies mainly in matrix multiplication (see Eq.(14)). Because of its fundamental importance in science and engineering, much effort has been devoted to finding and implementing fast matrix multiplication algorithms [12,13]. As mentioned above, the standard sequential algorithm has O(n³) time complexity for multiplying two n × n matrices. Since Strassen's O(n^2.807) algorithm was discovered in 1969, successive progress has been made in developing fast sequential matrix multiplication algorithms with time complexity O(n^β), where 2 < β < 3; the current best value of β is below 2.38. With parallel and distributed computing techniques, the running time of such algorithms can be reduced to as little as O(log n). Therefore, our algorithm can be computed in O(kl² log n) time using state-of-the-art matrix multiplication algorithms. A more detailed discussion of fast matrix multiplication is beyond the scope of this article.

V. Experiments

In this section we report experiments on two synthetic data sets and three real-world data sets, all of which are binary classification tasks. The synthetic data sets were generated by PRTools 4 [14] and the real-world data sets were selected from the UCI collection [15]. The two synthetic data sets are named Difficult and Highleyman, respectively; their data distributions are shown in Fig.1. For the Breast cancer data set, we directly eliminated the examples containing missing attribute values. The fourth data set, LetterIJ, was derived from the Letter image recognition data: we set up a binary classification task between the letters I and J, which is regarded as a somewhat challenging problem. From the last data set, Iris plants, we set up a binary classification data set, Iris23, between the classes Iris versicolour and Iris virginica, which are not linearly separable. Each data set was partitioned into a training set and a test set: 30% of the data serves as the training set and 70% as the test set. The object generation follows the class prior probabilities. For each of the cases shown in Table 1, we used the training set to perform optimal kernel selection and then tested the generalization performance of SVM on the test set.
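The evaluation step just described (kernel selection on the training split, then SVM error and support-vector count on the test split) can be sketched as follows. This is an assumption-laden illustration: it uses scikit-learn's SVC (a wrapper around LIBSVM) rather than LIBSVM directly, synthetic stand-in data rather than the paper's data sets, and the identity matrix as a placeholder for a learned P. It relies on the identity k_P(x, x') = exp(−‖P^T x − P^T x'‖²), i.e. a standard RBF kernel with γ = 1 applied to the transformed inputs P^T x:

```python
import numpy as np
from sklearn.svm import SVC

# Stand-in for a matrix learned by the kernel adaptation procedure.
P = np.eye(2)

# Synthetic stand-in data: two well-separated Gaussian classes.
rng = np.random.default_rng(1)
X_train = np.vstack([rng.normal(0.0, 0.3, (20, 2)), rng.normal(2.0, 0.3, (20, 2))])
y_train = np.array([1] * 20 + [-1] * 20)
X_test = np.vstack([rng.normal(0.0, 0.3, (30, 2)), rng.normal(2.0, 0.3, (30, 2))])
y_test = np.array([1] * 30 + [-1] * 30)

# An SVM with the learned kernel k_P is equivalent to an RBF-kernel SVM
# (gamma = 1) on the linearly transformed inputs x -> P^T x.
clf = SVC(C=1.0, kernel='rbf', gamma=1.0)      # C = 1, as in the experiments below
clf.fit(X_train @ P, y_train)
error = 1.0 - clf.score(X_test @ P, y_test)    # test classification error
n_svs = clf.support_vectors_.shape[0]          # number of support vectors
```

Reducing the learned kernel to a plain RBF kernel on transformed data is what makes any off-the-shelf SVM implementation usable without a custom kernel interface.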
We stopped our algorithm when the improvement of the kernel polarization was less than 10⁻⁸ and averaged our results over 10 runs. The SVM classifier was implemented with the software LIBSVM [16].

Fig. 1. Data distributions of the two synthetic data sets

Table 2 presents the results for the three kernel parameterizations Single, Diagonal and Full. We list the classification error and the number of support vectors for a slack penalty of C = 1; the average over all data sets is shown in the last row of the table. For all five data sets, better results are achieved when the kernel adaptation is not restricted to the Single case, and the Full case in turn leads to better results than the Diagonal case. On average, the Single, Diagonal and Full cases yield test errors of 7.60%, 5.85% and 3.77%, respectively. The Diagonal and Full cases have another remarkable advantage: the number of support vectors decreases. Again on average, the Single, Diagonal and Full cases yield 59, 48 and 37 support vectors, respectively. It is well known that decreasing the number of support vectors improves classification speed in practice.

Table 2. Performance results on five data sets
                      Single            Diagonal          Full
    Data set          Error    nSVs     Error    nSVs     Error    nSVs
    Difficult         10.71%                                       14
    Highleyman        10.71%                                       17
    Breast cancer      5.03%                                       30
    LetterIJ           5.86%                                       113
    Iris23                                                         10
    Avg.               7.60%   59        5.85%   48        3.77%   37

Fig. 2. Classification error rate on the Iris23 data set

Fig. 3. Number of support vectors on the Iris23 data set

Note that the parameter C was not optimized in any way. We varied C over the range C = 1, 10, 30, 50, 70, 100 to test performance on the Iris23 data set and show the results in Fig.2 and Fig.3; similar results were obtained for the other data sets. For all values of C considered, both the Diagonal and the Full case lead to better results than the Single case: both the classification error and the number of support vectors decrease.
The property that the Full case gives better results than the Diagonal case is also preserved. In Ref.[7], kernel polarization for the Gaussian kernel with a single scale parameter (spherical kernel) was shown to yield about the same classification results as exhaustive parameter search. However, merely by dropping the restriction to a single scale parameter we obtain a linear transformation of the input space such that, in the transformed space, Gaussian kernels perform better. This phenomenon is due to the inherent advantage of the optimization criterion discussed in Section III and to the finer data distribution model, in other words a more appropriate (more separated) embedding, obtained via adaptation of the matrix P. In summary, we have empirically illustrated the effectiveness of our method and further confirmed that kernel polarization is an effective criterion for kernel selection.

VI. Conclusion

The choice of the kernel function, which implicitly determines the mapping between the input space and the feature space, is of crucial importance to kernel methods. In this study we proposed an approach for learning general Gaussian kernels in which not only the scaling but also the rotation is adapted. It is based on the possibility of computing the gradient of kernel polarization with respect to a linear transformation matrix. The ease with which kernel polarization can be calculated using only the training data, prior to designing any classifier, makes it an attractive criterion for kernel selection. Furthermore, we presented an investigation of the geometric significance of maximizing kernel polarization, which intuitively shows why we chose kernel polarization as the criterion for kernel selection. For SVM, experimental results on both synthetic and real data sets show that our method leads both to an improvement in classification accuracy and to a reduction in the number of support vectors. Future work will focus on regression and clustering applications. Since we employed Gaussian kernels throughout, extension to other kernel functions is another direction for future work.

References

[1] V. Vapnik, The Nature of Statistical Learning Theory, Springer-Verlag, New York, USA, 1995.
[2] J. Shawe-Taylor, N. Cristianini, Kernel Methods for Pattern Analysis, Cambridge University Press, Cambridge, UK, 2004.
[3] O. Chapelle, V. Vapnik, O. Bousquet, S. Mukherjee, Choosing multiple parameters for support vector machines, Machine Learning, Vol.46, No.1, 2002.
[4] S.S. Keerthi, Efficient tuning of SVM hyperparameters using radius/margin bound and iterative algorithms, IEEE Trans. on Neural Networks, Vol.13, No.5, 2002.
[5] K.M. Chung, W.C. Kao, C.L. Sun, L.L. Wang, C.J. Lin, Radius margin bounds for support vector machines with RBF kernel, Neural Computation, Vol.15, No.11, 2003.
[6] Chang Qun, Wang Xiaolong, Lin Yimeng, Wang Xizhao, D.S. Yeung, Support vector classification and Gaussian kernel with multiple widths, Acta Electronica Sinica, Vol.35, No.3, 2007. (in Chinese)
[7] Y. Baram, Learning by kernel polarization, Neural Computation, Vol.17, No.6, 2005.
[8] N. Cristianini, J. Shawe-Taylor, A. Elisseeff, J. Kandola, On kernel-target alignment, Proc. of Neural Information Processing Systems (NIPS): Natural and Synthetic, Vancouver, British Columbia, Canada, 2001.
[9] J.B. Pothin, C. Richard, A greedy algorithm for optimizing the kernel alignment and the performance of kernel machines, Proc. of 14th European Signal Processing Conference (EUSIPCO), Florence, Italy, pp.4–8, 2006.
[10] C. Igel, T. Glasmachers, B. Mersch, N. Pfeifer, P. Meinicke, Gradient-based optimization of kernel-target alignment for sequence kernels applied to bacterial gene start detection, IEEE/ACM Transactions on Computational Biology and Bioinformatics, Vol.4, No.2, 2007.
[11] T. Glasmachers, C. Igel, Gradient-based adaptation of general Gaussian kernels, Neural Computation, Vol.17, No.10, 2005.
[12] K. Li, Scalable parallel matrix multiplication on distributed memory parallel computers, Journal of Parallel and Distributed Computing, Vol.61, No.12, 2001.
[13] K. Goto, R. van de Geijn, Anatomy of high-performance matrix multiplication, ACM Trans. on Mathematical Software, Vol.34, No.3, pp.12–25, 2008.
[14] R.P.W. Duin, P. Juszczak, P. Paclik, E. Pekalska, D. de Ridder, D.M.J. Tax, PRTools: A Matlab Toolbox for Pattern Recognition, Delft University of Technology. Software available online.
[15] A. Asuncion, D.J. Newman, UCI Machine Learning Repository, Irvine, CA: University of California, School of Information and Computer Science, 2007.
[16] C.C. Chang, C.J. Lin, LIBSVM: A Library for Support Vector Machines. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm.

WANG Tinghua did his undergraduate study at Nanchang University majoring in computer science and received the M.S. degree in computer science from Nanchang University in 2006. He is currently working toward the Ph.D. degree at the School of Computer and Information Technology, Beijing Jiaotong University, China. His current research focuses on machine learning and data mining, especially support vector machines and kernel-based learning methods. (Email: wthbjtu@163.com)

HUANG Houkuan did his undergraduate study at Peking University majoring in mathematics and his graduate study at Harbin Engineering University majoring in applied mathematics. He is currently a professor and Ph.D. supervisor at the School of Computer and Information Technology, Beijing Jiaotong University, China. His current interests include artificial intelligence, KDD and multi-agent systems.

TIAN Shengfeng is a professor and Ph.D. supervisor at the School of Computer and Information Technology, Beijing Jiaotong University, China. His current research focuses on artificial intelligence, pattern recognition and network security, especially support vector machines and network intrusion detection.

DENG Dayong received the Ph.D. degree in computer science from Beijing Jiaotong University in 2007. He is currently an assistant professor at the School of Mathematics and Information Engineering, Zhejiang Normal University, China. His current research interests include rough sets and data mining.


More information

S1 Note. Basis functions.

S1 Note. Basis functions. S1 Note. Bass functons. Contents Types of bass functons...1 The Fourer bass...2 B-splne bass...3 Power and type I error rates wth dfferent numbers of bass functons...4 Table S1. Smulaton results of type

More information

Announcements. Supervised Learning

Announcements. Supervised Learning Announcements See Chapter 5 of Duda, Hart, and Stork. Tutoral by Burge lnked to on web page. Supervsed Learnng Classfcaton wth labeled eamples. Images vectors n hgh-d space. Supervsed Learnng Labeled eamples

More information

Tsinghua University at TAC 2009: Summarizing Multi-documents by Information Distance

Tsinghua University at TAC 2009: Summarizing Multi-documents by Information Distance Tsnghua Unversty at TAC 2009: Summarzng Mult-documents by Informaton Dstance Chong Long, Mnle Huang, Xaoyan Zhu State Key Laboratory of Intellgent Technology and Systems, Tsnghua Natonal Laboratory for

More information

A Modified Median Filter for the Removal of Impulse Noise Based on the Support Vector Machines

A Modified Median Filter for the Removal of Impulse Noise Based on the Support Vector Machines A Modfed Medan Flter for the Removal of Impulse Nose Based on the Support Vector Machnes H. GOMEZ-MORENO, S. MALDONADO-BASCON, F. LOPEZ-FERRERAS, M. UTRILLA- MANSO AND P. GIL-JIMENEZ Departamento de Teoría

More information

BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET

BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET 1 BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET TZU-CHENG CHUANG School of Electrcal and Computer Engneerng, Purdue Unversty, West Lafayette, Indana 47907 SAUL B. GELFAND School

More information

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints Australan Journal of Basc and Appled Scences, 2(4): 1204-1208, 2008 ISSN 1991-8178 Sum of Lnear and Fractonal Multobjectve Programmng Problem under Fuzzy Rules Constrants 1 2 Sanjay Jan and Kalash Lachhwan

More information

R s s f. m y s. SPH3UW Unit 7.3 Spherical Concave Mirrors Page 1 of 12. Notes

R s s f. m y s. SPH3UW Unit 7.3 Spherical Concave Mirrors Page 1 of 12. Notes SPH3UW Unt 7.3 Sphercal Concave Mrrors Page 1 of 1 Notes Physcs Tool box Concave Mrror If the reflectng surface takes place on the nner surface of the sphercal shape so that the centre of the mrror bulges

More information

Classifying Acoustic Transient Signals Using Artificial Intelligence

Classifying Acoustic Transient Signals Using Artificial Intelligence Classfyng Acoustc Transent Sgnals Usng Artfcal Intellgence Steve Sutton, Unversty of North Carolna At Wlmngton (suttons@charter.net) Greg Huff, Unversty of North Carolna At Wlmngton (jgh7476@uncwl.edu)

More information

An Improved Image Segmentation Algorithm Based on the Otsu Method

An Improved Image Segmentation Algorithm Based on the Otsu Method 3th ACIS Internatonal Conference on Software Engneerng, Artfcal Intellgence, Networkng arallel/dstrbuted Computng An Improved Image Segmentaton Algorthm Based on the Otsu Method Mengxng Huang, enjao Yu,

More information

2x x l. Module 3: Element Properties Lecture 4: Lagrange and Serendipity Elements

2x x l. Module 3: Element Properties Lecture 4: Lagrange and Serendipity Elements Module 3: Element Propertes Lecture : Lagrange and Serendpty Elements 5 In last lecture note, the nterpolaton functons are derved on the bass of assumed polynomal from Pascal s trangle for the fled varable.

More information

Term Weighting Classification System Using the Chi-square Statistic for the Classification Subtask at NTCIR-6 Patent Retrieval Task

Term Weighting Classification System Using the Chi-square Statistic for the Classification Subtask at NTCIR-6 Patent Retrieval Task Proceedngs of NTCIR-6 Workshop Meetng, May 15-18, 2007, Tokyo, Japan Term Weghtng Classfcaton System Usng the Ch-square Statstc for the Classfcaton Subtask at NTCIR-6 Patent Retreval Task Kotaro Hashmoto

More information

Machine Learning 9. week

Machine Learning 9. week Machne Learnng 9. week Mappng Concept Radal Bass Functons (RBF) RBF Networks 1 Mappng It s probably the best scenaro for the classfcaton of two dataset s to separate them lnearly. As you see n the below

More information

Face Recognition Method Based on Within-class Clustering SVM

Face Recognition Method Based on Within-class Clustering SVM Face Recognton Method Based on Wthn-class Clusterng SVM Yan Wu, Xao Yao and Yng Xa Department of Computer Scence and Engneerng Tong Unversty Shangha, Chna Abstract - A face recognton method based on Wthn-class

More information

Unsupervised Learning and Clustering

Unsupervised Learning and Clustering Unsupervsed Learnng and Clusterng Supervsed vs. Unsupervsed Learnng Up to now we consdered supervsed learnng scenaro, where we are gven 1. samples 1,, n 2. class labels for all samples 1,, n Ths s also

More information

A Binarization Algorithm specialized on Document Images and Photos

A Binarization Algorithm specialized on Document Images and Photos A Bnarzaton Algorthm specalzed on Document mages and Photos Ergna Kavalleratou Dept. of nformaton and Communcaton Systems Engneerng Unversty of the Aegean kavalleratou@aegean.gr Abstract n ths paper, a

More information

Problem Definitions and Evaluation Criteria for Computational Expensive Optimization

Problem Definitions and Evaluation Criteria for Computational Expensive Optimization Problem efntons and Evaluaton Crtera for Computatonal Expensve Optmzaton B. Lu 1, Q. Chen and Q. Zhang 3, J. J. Lang 4, P. N. Suganthan, B. Y. Qu 6 1 epartment of Computng, Glyndwr Unversty, UK Faclty

More information

APPLICATION OF MULTIVARIATE LOSS FUNCTION FOR ASSESSMENT OF THE QUALITY OF TECHNOLOGICAL PROCESS MANAGEMENT

APPLICATION OF MULTIVARIATE LOSS FUNCTION FOR ASSESSMENT OF THE QUALITY OF TECHNOLOGICAL PROCESS MANAGEMENT 3. - 5. 5., Brno, Czech Republc, EU APPLICATION OF MULTIVARIATE LOSS FUNCTION FOR ASSESSMENT OF THE QUALITY OF TECHNOLOGICAL PROCESS MANAGEMENT Abstract Josef TOŠENOVSKÝ ) Lenka MONSPORTOVÁ ) Flp TOŠENOVSKÝ

More information

Face Recognition University at Buffalo CSE666 Lecture Slides Resources:

Face Recognition University at Buffalo CSE666 Lecture Slides Resources: Face Recognton Unversty at Buffalo CSE666 Lecture Sldes Resources: http://www.face-rec.org/algorthms/ Overvew of face recognton algorthms Correlaton - Pxel based correspondence between two face mages Structural

More information

TN348: Openlab Module - Colocalization

TN348: Openlab Module - Colocalization TN348: Openlab Module - Colocalzaton Topc The Colocalzaton module provdes the faclty to vsualze and quantfy colocalzaton between pars of mages. The Colocalzaton wndow contans a prevew of the two mages

More information

Using Neural Networks and Support Vector Machines in Data Mining

Using Neural Networks and Support Vector Machines in Data Mining Usng eural etworks and Support Vector Machnes n Data Mnng RICHARD A. WASIOWSKI Computer Scence Department Calforna State Unversty Domnguez Hlls Carson, CA 90747 USA Abstract: - Multvarate data analyss

More information

Optimizing Document Scoring for Query Retrieval

Optimizing Document Scoring for Query Retrieval Optmzng Document Scorng for Query Retreval Brent Ellwen baellwe@cs.stanford.edu Abstract The goal of ths project was to automate the process of tunng a document query engne. Specfcally, I used machne learnng

More information

The Shortest Path of Touring Lines given in the Plane

The Shortest Path of Touring Lines given in the Plane Send Orders for Reprnts to reprnts@benthamscence.ae 262 The Open Cybernetcs & Systemcs Journal, 2015, 9, 262-267 The Shortest Path of Tourng Lnes gven n the Plane Open Access Ljuan Wang 1,2, Dandan He

More information

Related-Mode Attacks on CTR Encryption Mode

Related-Mode Attacks on CTR Encryption Mode Internatonal Journal of Network Securty, Vol.4, No.3, PP.282 287, May 2007 282 Related-Mode Attacks on CTR Encrypton Mode Dayn Wang, Dongda Ln, and Wenlng Wu (Correspondng author: Dayn Wang) Key Laboratory

More information

CLASSIFICATION OF ULTRASONIC SIGNALS

CLASSIFICATION OF ULTRASONIC SIGNALS The 8 th Internatonal Conference of the Slovenan Socety for Non-Destructve Testng»Applcaton of Contemporary Non-Destructve Testng n Engneerng«September -3, 5, Portorož, Slovena, pp. 7-33 CLASSIFICATION

More information

Unsupervised Learning and Clustering

Unsupervised Learning and Clustering Unsupervsed Learnng and Clusterng Why consder unlabeled samples?. Collectng and labelng large set of samples s costly Gettng recorded speech s free, labelng s tme consumng 2. Classfer could be desgned

More information

LECTURE : MANIFOLD LEARNING

LECTURE : MANIFOLD LEARNING LECTURE : MANIFOLD LEARNING Rta Osadchy Some sldes are due to L.Saul, V. C. Raykar, N. Verma Topcs PCA MDS IsoMap LLE EgenMaps Done! Dmensonalty Reducton Data representaton Inputs are real-valued vectors

More information

An Optimal Algorithm for Prufer Codes *

An Optimal Algorithm for Prufer Codes * J. Software Engneerng & Applcatons, 2009, 2: 111-115 do:10.4236/jsea.2009.22016 Publshed Onlne July 2009 (www.scrp.org/journal/jsea) An Optmal Algorthm for Prufer Codes * Xaodong Wang 1, 2, Le Wang 3,

More information

Learning a Class-Specific Dictionary for Facial Expression Recognition

Learning a Class-Specific Dictionary for Facial Expression Recognition BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 16, No 4 Sofa 016 Prnt ISSN: 1311-970; Onlne ISSN: 1314-4081 DOI: 10.1515/cat-016-0067 Learnng a Class-Specfc Dctonary for

More information

Application of Learning Machine Methods to 3 D Object Modeling

Application of Learning Machine Methods to 3 D Object Modeling Applcaton of Learnng Machne Methods to 3 D Object Modelng Crstna Garca, and José Al Moreno Laboratoro de Computacón Emergente, Facultades de Cencas e Ingenería, Unversdad Central de Venezuela. Caracas,

More information

The Codesign Challenge

The Codesign Challenge ECE 4530 Codesgn Challenge Fall 2007 Hardware/Software Codesgn The Codesgn Challenge Objectves In the codesgn challenge, your task s to accelerate a gven software reference mplementaton as fast as possble.

More information

SLAM Summer School 2006 Practical 2: SLAM using Monocular Vision

SLAM Summer School 2006 Practical 2: SLAM using Monocular Vision SLAM Summer School 2006 Practcal 2: SLAM usng Monocular Vson Javer Cvera, Unversty of Zaragoza Andrew J. Davson, Imperal College London J.M.M Montel, Unversty of Zaragoza. josemar@unzar.es, jcvera@unzar.es,

More information

Intra-Parametric Analysis of a Fuzzy MOLP

Intra-Parametric Analysis of a Fuzzy MOLP Intra-Parametrc Analyss of a Fuzzy MOLP a MIAO-LING WANG a Department of Industral Engneerng and Management a Mnghsn Insttute of Technology and Hsnchu Tawan, ROC b HSIAO-FAN WANG b Insttute of Industral

More information

A Fast Visual Tracking Algorithm Based on Circle Pixels Matching

A Fast Visual Tracking Algorithm Based on Circle Pixels Matching A Fast Vsual Trackng Algorthm Based on Crcle Pxels Matchng Zhqang Hou hou_zhq@sohu.com Chongzhao Han czhan@mal.xjtu.edu.cn Ln Zheng Abstract: A fast vsual trackng algorthm based on crcle pxels matchng

More information

Content Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers

Content Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers IOSR Journal of Electroncs and Communcaton Engneerng (IOSR-JECE) e-issn: 78-834,p- ISSN: 78-8735.Volume 9, Issue, Ver. IV (Mar - Apr. 04), PP 0-07 Content Based Image Retreval Usng -D Dscrete Wavelet wth

More information

Evolutionary Support Vector Regression based on Multi-Scale Radial Basis Function Kernel

Evolutionary Support Vector Regression based on Multi-Scale Radial Basis Function Kernel Eolutonary Support Vector Regresson based on Mult-Scale Radal Bass Functon Kernel Tanasanee Phenthrakul and Boonserm Kjsrkul Abstract Kernel functons are used n support ector regresson (SVR) to compute

More information

Fuzzy Modeling of the Complexity vs. Accuracy Trade-off in a Sequential Two-Stage Multi-Classifier System

Fuzzy Modeling of the Complexity vs. Accuracy Trade-off in a Sequential Two-Stage Multi-Classifier System Fuzzy Modelng of the Complexty vs. Accuracy Trade-off n a Sequental Two-Stage Mult-Classfer System MARK LAST 1 Department of Informaton Systems Engneerng Ben-Guron Unversty of the Negev Beer-Sheva 84105

More information

Semi-supervised Classification Using Local and Global Regularization

Semi-supervised Classification Using Local and Global Regularization Proceedngs of the Twenty-Thrd AAAI Conference on Artfcal Intellgence (2008) Sem-supervsed Classfcaton Usng Local and Global Regularzaton Fe Wang 1, Tao L 2, Gang Wang 3, Changshu Zhang 1 1 Department of

More information

Journal of Chemical and Pharmaceutical Research, 2014, 6(6): Research Article. A selective ensemble classification method on microarray data

Journal of Chemical and Pharmaceutical Research, 2014, 6(6): Research Article. A selective ensemble classification method on microarray data Avalable onlne www.ocpr.com Journal of Chemcal and Pharmaceutcal Research, 2014, 6(6):2860-2866 Research Artcle ISSN : 0975-7384 CODEN(USA) : JCPRC5 A selectve ensemble classfcaton method on mcroarray

More information

CS246: Mining Massive Datasets Jure Leskovec, Stanford University

CS246: Mining Massive Datasets Jure Leskovec, Stanford University CS46: Mnng Massve Datasets Jure Leskovec, Stanford Unversty http://cs46.stanford.edu /19/013 Jure Leskovec, Stanford CS46: Mnng Massve Datasets, http://cs46.stanford.edu Perceptron: y = sgn( x Ho to fnd

More information

General Vector Machine. Hong Zhao Department of Physics, Xiamen University

General Vector Machine. Hong Zhao Department of Physics, Xiamen University General Vector Machne Hong Zhao (zhaoh@xmu.edu.cn) Department of Physcs, Xamen Unversty The support vector machne (SVM) s an mportant class of learnng machnes for functon approach, pattern recognton, and

More information

Polyhedral Compilation Foundations

Polyhedral Compilation Foundations Polyhedral Complaton Foundatons Lous-Noël Pouchet pouchet@cse.oho-state.edu Dept. of Computer Scence and Engneerng, the Oho State Unversty Feb 8, 200 888., Class # Introducton: Polyhedral Complaton Foundatons

More information

Fast Computation of Shortest Path for Visiting Segments in the Plane

Fast Computation of Shortest Path for Visiting Segments in the Plane Send Orders for Reprnts to reprnts@benthamscence.ae 4 The Open Cybernetcs & Systemcs Journal, 04, 8, 4-9 Open Access Fast Computaton of Shortest Path for Vstng Segments n the Plane Ljuan Wang,, Bo Jang

More information

SVM-based Learning for Multiple Model Estimation

SVM-based Learning for Multiple Model Estimation SVM-based Learnng for Multple Model Estmaton Vladmr Cherkassky and Yunqan Ma Department of Electrcal and Computer Engneerng Unversty of Mnnesota Mnneapols, MN 55455 {cherkass,myq}@ece.umn.edu Abstract:

More information

Fast Sparse Gaussian Processes Learning for Man-Made Structure Classification

Fast Sparse Gaussian Processes Learning for Man-Made Structure Classification Fast Sparse Gaussan Processes Learnng for Man-Made Structure Classfcaton Hang Zhou Insttute for Vson Systems Engneerng, Dept Elec. & Comp. Syst. Eng. PO Box 35, Monash Unversty, Clayton, VIC 3800, Australa

More information

GSLM Operations Research II Fall 13/14

GSLM Operations Research II Fall 13/14 GSLM 58 Operatons Research II Fall /4 6. Separable Programmng Consder a general NLP mn f(x) s.t. g j (x) b j j =. m. Defnton 6.. The NLP s a separable program f ts objectve functon and all constrants are

More information

Classification of Face Images Based on Gender using Dimensionality Reduction Techniques and SVM

Classification of Face Images Based on Gender using Dimensionality Reduction Techniques and SVM Classfcaton of Face Images Based on Gender usng Dmensonalty Reducton Technques and SVM Fahm Mannan 260 266 294 School of Computer Scence McGll Unversty Abstract Ths report presents gender classfcaton based

More information

Efficient Text Classification by Weighted Proximal SVM *

Efficient Text Classification by Weighted Proximal SVM * Effcent ext Classfcaton by Weghted Proxmal SVM * Dong Zhuang 1, Benyu Zhang, Qang Yang 3, Jun Yan 4, Zheng Chen, Yng Chen 1 1 Computer Scence and Engneerng, Bejng Insttute of echnology, Bejng 100081, Chna

More information

Taxonomy of Large Margin Principle Algorithms for Ordinal Regression Problems

Taxonomy of Large Margin Principle Algorithms for Ordinal Regression Problems Taxonomy of Large Margn Prncple Algorthms for Ordnal Regresson Problems Amnon Shashua Computer Scence Department Stanford Unversty Stanford, CA 94305 emal: shashua@cs.stanford.edu Anat Levn School of Computer

More information

Collaboratively Regularized Nearest Points for Set Based Recognition

Collaboratively Regularized Nearest Points for Set Based Recognition Academc Center for Computng and Meda Studes, Kyoto Unversty Collaboratvely Regularzed Nearest Ponts for Set Based Recognton Yang Wu, Mchhko Mnoh, Masayuk Mukunok Kyoto Unversty 9/1/013 BMVC 013 @ Brstol,

More information

EYE CENTER LOCALIZATION ON A FACIAL IMAGE BASED ON MULTI-BLOCK LOCAL BINARY PATTERNS

EYE CENTER LOCALIZATION ON A FACIAL IMAGE BASED ON MULTI-BLOCK LOCAL BINARY PATTERNS P.G. Demdov Yaroslavl State Unversty Anatoly Ntn, Vladmr Khryashchev, Olga Stepanova, Igor Kostern EYE CENTER LOCALIZATION ON A FACIAL IMAGE BASED ON MULTI-BLOCK LOCAL BINARY PATTERNS Yaroslavl, 2015 Eye

More information

Proper Choice of Data Used for the Estimation of Datum Transformation Parameters

Proper Choice of Data Used for the Estimation of Datum Transformation Parameters Proper Choce of Data Used for the Estmaton of Datum Transformaton Parameters Hakan S. KUTOGLU, Turkey Key words: Coordnate systems; transformaton; estmaton, relablty. SUMMARY Advances n technologes and

More information

Towards Semantic Knowledge Propagation from Text to Web Images

Towards Semantic Knowledge Propagation from Text to Web Images Guoun Q (Unversty of Illnos at Urbana-Champagn) Charu C. Aggarwal (IBM T. J. Watson Research Center) Thomas Huang (Unversty of Illnos at Urbana-Champagn) Towards Semantc Knowledge Propagaton from Text

More information

U.C. Berkeley CS294: Beyond Worst-Case Analysis Handout 5 Luca Trevisan September 7, 2017

U.C. Berkeley CS294: Beyond Worst-Case Analysis Handout 5 Luca Trevisan September 7, 2017 U.C. Bereley CS294: Beyond Worst-Case Analyss Handout 5 Luca Trevsan September 7, 207 Scrbed by Haars Khan Last modfed 0/3/207 Lecture 5 In whch we study the SDP relaxaton of Max Cut n random graphs. Quc

More information

An Application of the Dulmage-Mendelsohn Decomposition to Sparse Null Space Bases of Full Row Rank Matrices

An Application of the Dulmage-Mendelsohn Decomposition to Sparse Null Space Bases of Full Row Rank Matrices Internatonal Mathematcal Forum, Vol 7, 2012, no 52, 2549-2554 An Applcaton of the Dulmage-Mendelsohn Decomposton to Sparse Null Space Bases of Full Row Rank Matrces Mostafa Khorramzadeh Department of Mathematcal

More information

On Some Entertaining Applications of the Concept of Set in Computer Science Course

On Some Entertaining Applications of the Concept of Set in Computer Science Course On Some Entertanng Applcatons of the Concept of Set n Computer Scence Course Krasmr Yordzhev *, Hrstna Kostadnova ** * Assocate Professor Krasmr Yordzhev, Ph.D., Faculty of Mathematcs and Natural Scences,

More information

SUMMARY... I TABLE OF CONTENTS...II INTRODUCTION...

SUMMARY... I TABLE OF CONTENTS...II INTRODUCTION... Summary A follow-the-leader robot system s mplemented usng Dscrete-Event Supervsory Control methods. The system conssts of three robots, a leader and two followers. The dea s to get the two followers to

More information

Skew Angle Estimation and Correction of Hand Written, Textual and Large areas of Non-Textual Document Images: A Novel Approach

Skew Angle Estimation and Correction of Hand Written, Textual and Large areas of Non-Textual Document Images: A Novel Approach Angle Estmaton and Correcton of Hand Wrtten, Textual and Large areas of Non-Textual Document Images: A Novel Approach D.R.Ramesh Babu Pyush M Kumat Mahesh D Dhannawat PES Insttute of Technology Research

More information

Mercer Kernels for Object Recognition with Local Features

Mercer Kernels for Object Recognition with Local Features TR004-50, October 004, Department of Computer Scence, Dartmouth College Mercer Kernels for Object Recognton wth Local Features Swe Lyu Department of Computer Scence Dartmouth College Hanover NH 03755 A

More information

For instance, ; the five basic number-sets are increasingly more n A B & B A A = B (1)

For instance, ; the five basic number-sets are increasingly more n A B & B A A = B (1) Secton 1.2 Subsets and the Boolean operatons on sets If every element of the set A s an element of the set B, we say that A s a subset of B, or that A s contaned n B, or that B contans A, and we wrte A

More information

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique //00 :0 AM Outlne and Readng The Greedy Method The Greedy Method Technque (secton.) Fractonal Knapsack Problem (secton..) Task Schedulng (secton..) Mnmum Spannng Trees (secton.) Change Money Problem Greedy

More information

Discriminative classifiers for object classification. Last time

Discriminative classifiers for object classification. Last time Dscrmnatve classfers for object classfcaton Thursday, Nov 12 Krsten Grauman UT Austn Last tme Supervsed classfcaton Loss and rsk, kbayes rule Skn color detecton example Sldng ndo detecton Classfers, boostng

More information

A Unified Framework for Semantics and Feature Based Relevance Feedback in Image Retrieval Systems

A Unified Framework for Semantics and Feature Based Relevance Feedback in Image Retrieval Systems A Unfed Framework for Semantcs and Feature Based Relevance Feedback n Image Retreval Systems Ye Lu *, Chunhu Hu 2, Xngquan Zhu 3*, HongJang Zhang 2, Qang Yang * School of Computng Scence Smon Fraser Unversty

More information

A new segmentation algorithm for medical volume image based on K-means clustering

A new segmentation algorithm for medical volume image based on K-means clustering Avalable onlne www.jocpr.com Journal of Chemcal and harmaceutcal Research, 2013, 5(12):113-117 Research Artcle ISSN : 0975-7384 CODEN(USA) : JCRC5 A new segmentaton algorthm for medcal volume mage based

More information

Research of Neural Network Classifier Based on FCM and PSO for Breast Cancer Classification

Research of Neural Network Classifier Based on FCM and PSO for Breast Cancer Classification Research of Neural Network Classfer Based on FCM and PSO for Breast Cancer Classfcaton Le Zhang 1, Ln Wang 1, Xujewen Wang 2, Keke Lu 2, and Ajth Abraham 3 1 Shandong Provncal Key Laboratory of Network

More information

PERFORMANCE EVALUATION FOR SCENE MATCHING ALGORITHMS BY SVM

PERFORMANCE EVALUATION FOR SCENE MATCHING ALGORITHMS BY SVM PERFORMACE EVALUAIO FOR SCEE MACHIG ALGORIHMS BY SVM Zhaohu Yang a, b, *, Yngyng Chen a, Shaomng Zhang a a he Research Center of Remote Sensng and Geomatc, ongj Unversty, Shangha 200092, Chna - yzhac@63.com

More information

Improvement of Spatial Resolution Using BlockMatching Based Motion Estimation and Frame. Integration

Improvement of Spatial Resolution Using BlockMatching Based Motion Estimation and Frame. Integration Improvement of Spatal Resoluton Usng BlockMatchng Based Moton Estmaton and Frame Integraton Danya Suga and Takayuk Hamamoto Graduate School of Engneerng, Tokyo Unversty of Scence, 6-3-1, Nuku, Katsuska-ku,

More information

An Entropy-Based Approach to Integrated Information Needs Assessment

An Entropy-Based Approach to Integrated Information Needs Assessment Dstrbuton Statement A: Approved for publc release; dstrbuton s unlmted. An Entropy-Based Approach to ntegrated nformaton Needs Assessment June 8, 2004 Wllam J. Farrell Lockheed Martn Advanced Technology

More information

An Iterative Solution Approach to Process Plant Layout using Mixed Integer Optimisation

An Iterative Solution Approach to Process Plant Layout using Mixed Integer Optimisation 17 th European Symposum on Computer Aded Process Engneerng ESCAPE17 V. Plesu and P.S. Agach (Edtors) 2007 Elsever B.V. All rghts reserved. 1 An Iteratve Soluton Approach to Process Plant Layout usng Mxed

More information

The Research of Ellipse Parameter Fitting Algorithm of Ultrasonic Imaging Logging in the Casing Hole

The Research of Ellipse Parameter Fitting Algorithm of Ultrasonic Imaging Logging in the Casing Hole Appled Mathematcs, 04, 5, 37-3 Publshed Onlne May 04 n ScRes. http://www.scrp.org/journal/am http://dx.do.org/0.436/am.04.584 The Research of Ellpse Parameter Fttng Algorthm of Ultrasonc Imagng Loggng

More information