Prediction of Disulfide Bonding Pattern Based on Support Vector Machine with Parameters Tuned by Multiple Trajectory Search

Prediction of Disulfide Bonding Pattern Based on Support Vector Machine with Parameters Tuned by Multiple Trajectory Search

HSUAN-HUNG LIN 1,2, LIN-YU TSENG 3,4
1 Department of Applied Mathematics, National Chung Hsing University, Taichung 402, Taiwan, ROC
2 Department of Management Information Systems, Central Taiwan University of Science and Technology, Taichung 40601, Taiwan, ROC
3 Institute of Networking and Multimedia, National Chung Hsing University, Taichung 402, Taiwan, ROC
4 Department of Computer Science and Engineering, National Chung Hsing University, Taichung 402, Taiwan, ROC
shlin@ctust.edu.tw, lytseng@cs.nchu.edu.tw

Abstract: The prediction of the locations of disulfide bridges helps in solving the protein folding problem. Most previous work on disulfide connectivity pattern prediction relies on prior knowledge of the bonding state of cysteines. In this study, an effective method is proposed to predict the disulfide connectivity pattern without prior knowledge of the cysteines' bonding state. To the best of our knowledge, without prior knowledge of the bonding state of cysteines, the best accuracy rates reported in the literature for the prediction of the overall disulfide connectivity pattern (Qp) and for disulfide bridge prediction (Qc) are 48% and 51%, respectively, on the dataset SPX. In this study, the cysteine position difference, the cysteine index difference, the predicted secondary structure of the protein and the PSSM score are used as features. A support vector machine (SVM) is trained to compute the connectivity probabilities of cysteine pairs. An evolutionary algorithm, the multiple trajectory search (MTS), is integrated with the SVM training to tune the parameters of the SVM and the window sizes for the predicted secondary structure and the PSSM. The maximum weight perfect matching algorithm is then used to find the disulfide connectivity pattern. Testing our method on the same dataset SPX, the accuracy rates are 54.5% for disulfide connectivity pattern prediction and about 60% for disulfide bridge prediction when the bonding state of cysteines is not known in advance.

Key-Words: Disulfide bonding pattern, SVM, multiple trajectory search

1 Introduction
Disulfide bonds play an important structural role in stabilizing protein conformations. The prediction of the disulfide bonding pattern helps, to a certain degree, the prediction of the three-dimensional protein structure and hence its function, because disulfide bonds impose geometrical constraints on the protein backbone. Some recent research works have shown the close relation between disulfide bonding patterns and protein structures [1, 2]. Fig. 1 shows the structures of (a) the tick anticoagulant peptide (1TAP), a protease inhibitor, and (b) the bovine pancreatic trypsin inhibitor (1QLQ), a serine protease inhibitor. Their sequence identity is only about 18%; the absence of significant sequence identity between 1TAP and 1QLQ was noted by Antuch et al. [3], and PSI-BLAST searches in the SwissProt/TrEMBL and NR databases were unsuccessful in identifying the similarity between these two proteins. In a disulfide-bonding-based classification, however, these two proteins have the same disulfide-bonding connectivity pattern (1-6, 2-3, 4-5), and both are classified in the BPTI-like superfamily in SCOP [4].

In the realm of disulfide bond prediction, two problems are addressed. The first is the prediction of the disulfide bonding states and the second is the prediction of the disulfide bonding pattern. Recently, significant progress has been made in the prediction of the disulfide bonding states. Several methods based on statistical analysis [5], neural networks [6, 7], or support vector machines [8] have been proposed. They are quite effective in predicting the bonding state of cysteines, with accuracy rates around 81%-90%.

Fig. 1. The structures of (a) the tick anticoagulant peptide (1TAP), a protease inhibitor, and (b) the bovine pancreatic trypsin inhibitor (1QLQ), a serine protease inhibitor.

Recently, several methods have been proposed for the prediction of the disulfide bonding pattern. The first method was presented by Fariselli and Casadio [9]. They reduced disulfide connectivity prediction to a graph matching problem in which the vertices are oxidized cysteines and the edges are labeled by the strength of interaction (contact potential) of the associated pair of cysteines. Monte Carlo simulated annealing is used to find the optimal values of the contact potentials, and finally the disulfide bridges are located by finding the maximum weight perfect matching. Fariselli et al. [10] improved their previous results by using neural networks to predict the cysteine pairwise interactions. Vullo and Frasconi [11] developed an ad hoc recursive neural network for scoring labeled undirected graphs that represent the connectivity pattern, and they improved the accuracy rate of bonding pattern prediction significantly from 34% to 44%. Cheng et al. [12] improved the prediction accuracy by using two-dimensional recursive neural networks to predict the connectivity probabilities between cysteine pairs. Ferrè and Clote [13] designed a diresidue neural network to predict the connectivity probabilities between cysteine pairs; they also used secondary structure information and diresidue frequencies in their training. Tsai et al. [14] used the support vector machine to predict the connectivity probabilities between cysteine pairs; the features used in training the support vector machine are local sequence profiles and the linear distance between cysteines. All the above-mentioned methods are based on the reduction of connectivity pattern prediction to the maximum weight perfect matching problem. The following methods are not based on this reduction. Chen and Hwang [15] used the support vector machine to predict the bonding pattern directly; the features they used in training the support vector machine are the coupling between the local sequence environments of cysteine pairs, the cysteine separations, and the amino acid content. Zhao et al. [16] used a simple feature called the cysteine separation profile (CSP) to predict the connectivity patterns. Chen et al. [17] proposed a two-level model. Lu et al. [18] obtained an accuracy of 73.9% by using a genetic algorithm to optimize feature selection for the SVM. Song et al. [19] obtained an accuracy of 74.4% by using multiple sequence feature vectors and secondary structure; this accuracy rate is the best one found in the literature. Rubinstein and Fiser [20] analyzed the correlated mutation patterns in multiple sequence alignments to predict the disulfide bond connectivity. All these methods except those of Cheng et al. [12] and Ferrè and Clote [13] assume that the bonding states are known; the methods of Cheng et al. [12] and Ferrè and Clote [13] can be applied whether the bonding states are known or not.

1.1 Support vector machine
The support vector machine (SVM) is a supervised learning method used for classification [21]. It is believed to be superior to traditional statistical and neural network classifiers; however, it is critical to determine a suitable combination of SVM parameters in order to obtain good classification performance. A special property of the SVM is that it simultaneously minimizes the empirical classification error and maximizes the geometric margin. It is a useful technique for data classification and regression and has become an important tool for machine learning and data mining. It is a powerful methodology for

solving problems in nonlinear classification, function estimation and density estimation, and it has led to many applications, such as image interpretation, data mining and other biotechnological fields [22-24]. The SVM is generally used for data that can be classified into two classes; however, it can easily be extended to the classification of multiple classes [25]. In general, the SVM performs better than existing methods such as neural networks and decision trees [26-28]. Recently, the application of SVMs in medicine has grown rapidly; for example, they have been applied to the prediction of RNA-binding sites in proteins [29] and to protein secondary structure prediction [30]. The SVM has also been applied to biological problems [31, 32], electrical energy consumption forecasting [33], remote sensing image classification [34], and pattern recognition and data classification [35].

The goal of the support vector machine is to separate classes by constructing separating hyperplanes with the greatest margin to the boundary of each class. For a two-class classification example, let us view the input data as points in an n-dimensional space. The SVM constructs the hyperplane (Eq. 1) that separates the two classes while leaving the maximum margin from both classes [36, 37]:

    g(x) = w^{T} x + w_{0}    (1)

The distance of a point from the hyperplane is given by

    z = \frac{|g(x)|}{\|w\|}    (2)

As shown in Fig. 2, the values of w and w_{0} in Eq. (1) are scaled so that the values of g(x) at the nearest points in class 1 and class 2 equal 1 and -1, respectively. Therefore, finding the hyperplane becomes a quadratic optimization problem, which can be formulated as:

    Minimize   J(w) = \frac{\|w\|^{2}}{2}
    Subject to y_{i}(w^{T} x_{i} + w_{0}) \geq 1,  i = 1, 2, \ldots, N    (3)

The minimizer must satisfy the Karush-Kuhn-Tucker (KKT) conditions, and the problem can be solved by considering Lagrangian duality. It can be stated equivalently in its Wolfe dual representation form:

    Maximize   L(w, w_{0}, \lambda) = \frac{\|w\|^{2}}{2} - \sum_{i=1}^{N} \lambda_{i}[y_{i}(w^{T} x_{i} + w_{0}) - 1]
    Subject to w = \sum_{i=1}^{N} \lambda_{i} y_{i} x_{i},  \sum_{i=1}^{N} \lambda_{i} y_{i} = 0,  \lambda_{i} \geq 0    (4)

where L(w, w_{0}, \lambda) is the Lagrangian function and \lambda is the vector of Lagrange multipliers. By comparing Eqs. (3) and (4), it is noted that the first two constraints in Eq. (4) are equality constraints, which makes the problem easier to solve. After some algebraic manipulation, Eq. (4) becomes

    \max_{\lambda} \sum_{i=1}^{N} \lambda_{i} - \frac{1}{2}\sum_{i,j} \lambda_{i}\lambda_{j} y_{i} y_{j} x_{i}^{T} x_{j}
    Subject to \sum_{i=1}^{N} \lambda_{i} y_{i} = 0  with  \lambda_{i} \geq 0    (5)

As soon as the Lagrange multipliers are obtained by maximizing the above equation, the optimal hyperplane can be obtained from w = \sum_{i=1}^{N} \lambda_{i} y_{i} x_{i} in Eq. (4). Once the optimal hyperplane is obtained, classification of a sample is performed based on the sign of the following equation:

    g(x) = w^{T} x + w_{0} = \sum_{i=1}^{N_{s}} \lambda_{i} y_{i} x_{i}^{T} x + w_{0}    (6)

where N_{s} is the number of support vectors.

Fig. 2. The hyperplane with the greatest margin. Note that the margin of direction 2 is larger than the margin of direction 1.

For a vector x \in R^{l} in the original feature space, assume that there exists a mapping from x \in R^{l} to y = \phi(x) \in R^{k}, where k is usually much higher than l. Then it always holds that

    \sum_{r} \phi_{r}(x)\,\phi_{r}(z) = K(x, z)    (7)

where \phi_{r}(x) is the r-th component of the mapping and the kernel function K(x, z) is a symmetric function satisfying the following condition

    \int\!\!\int K(x, z)\, g(x)\, g(z)\, dx\, dz \geq 0,  for any g such that \int g(x)^{2}\, dx < \infty    (8)

For a nonlinear classifier, various kernels, including the polynomial, radial basis function, and hyperbolic tangent kernels shown in Eq. (9), can be used to map the original sample space into a new Euclidean space in which Mercer's conditions are satisfied. A linear classifier can then be designed for classification.

    K(x, z) = (x^{T} z + 1)^{q},  q > 0    (9a)
    K(x, z) = \exp(-\|x - z\|^{2} / \sigma^{2})    (9b)
    K(x, z) = \tanh(\beta x^{T} z + \gamma)    (9c)

n-fold cross-validation of the SVM model is carried out by dividing the dataset into n folds; when one fold is reserved for testing, the other n-1 folds are used for training the model.

1.2 Multiple Trajectory Search
The multiple trajectory search (MTS) was originally presented for large scale global optimization [38]. The MTS has also been used to solve multiobjective optimization problems and obtained satisfactory results [39]. It uses multiple agents to search the solution space concurrently. Each agent performs an iterated local search using one of three candidate local search methods. By choosing the local search method that best fits the landscape of a solution's neighborhood, an agent may find its way to a local optimum or the global optimum.

1.2.1 Orthogonal Array and Simulated Orthogonal Array
The concept of orthogonal arrays, which are used in experimental design methods, is briefly introduced here. Suppose that in an experiment there are k factors and each factor has q levels. In order to find the best setting of each factor's level, q^k experiments must be done. Very often it is not possible or cost-effective to test all q^k combinations, and it is desirable to test a small but representative sample of combinations. Orthogonal arrays were developed for this purpose. For an experiment with k factors, each having q levels, an orthogonal array OA(n, k, q, t) is an array with n rows and k columns representing a sample of n test experiments that satisfies the following three conditions. (1) For the factor in any column, every level occurs the same number of times. (2) For the t factors in any t columns, every combination of q levels occurs the same number of times. (3) The selected combinations are uniformly distributed over the whole space of all possible combinations. In the notation OA(n, k, q, t), n is the number of experiments, k is the number of factors, q is the number of levels of each factor and t is called the strength. Orthogonal arrays exist only for some specific values of n and k, so it is not always possible to use an OA. Tseng and Chen [38] therefore proposed the simulated orthogonal array (SOA). The SOA satisfies only the first of the three conditions above, but it is easy to construct an SOA of almost any size. Suppose there are k factors and each factor has q levels; an m x k simulated orthogonal array SOA_{m x k}, with m being a multiple of q, can be generated as follows. For each column of SOA_{m x k}, a random permutation of 0, 1, ..., q-1 is generated and denoted as sequence C. The elements of C are then picked one by one sequentially, and each is filled into a randomly chosen empty entry of the column. If all elements of C have been picked, the process picks elements again from the beginning of C. Hence in every column of SOA_{m x k}, each of the q levels appears the same number of times (condition 1).

1.2.2 The Multiple Trajectory Search
The MTS generates M initial solutions by utilizing the simulated orthogonal array SOA_{M x N}, where the number of factors corresponds to the dimension N and the number of levels of each factor is taken to be M, so each of 0, 1, ..., M-1 appears once in every column. Using the SOA tends to make these M initial solutions uniformly distributed over the feasible solution space.
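The two constructions just described, the SOA itself and the SOA-based initial solutions of the MTS, can be written down compactly. The following is a minimal Python sketch assuming NumPy; the function names and data layout are our own choices for illustration and are not part of the original method description.

    import numpy as np

    def simulated_orthogonal_array(m, k, q, rng=None):
        """Build an m-by-k simulated orthogonal array: in every column each of the
        q levels 0..q-1 appears the same number of times (m must be a multiple of q)."""
        assert m % q == 0
        rng = np.random.default_rng(rng)
        soa = np.empty((m, k), dtype=int)
        for j in range(k):
            levels = rng.permutation(q)        # random permutation C of 0..q-1
            column = np.tile(levels, m // q)   # pick its elements repeatedly
            rng.shuffle(column)                # place them in random entries of the column
            soa[:, j] = column
        return soa

    def initial_solutions(M, N, lower, upper, rng=None):
        """Map each row of an SOA_{M x N} with M levels to a point of [lower, upper]^N,
        i.e. X_i[j] = l_j + (u_j - l_j) * SOA[i, j] / (M - 1)."""
        soa = simulated_orthogonal_array(M, N, q=M, rng=rng)
        lower = np.asarray(lower, dtype=float)
        upper = np.asarray(upper, dtype=float)
        return lower + (upper - lower) * soa / (M - 1)

Filling a tiled permutation into shuffled positions has the same effect as the sequential fill-into-random-empty-entry procedure described above: every level still occurs exactly m/q times per column, at random positions.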
The initial search range for the local search methods is set to half of the difference between the upper bound and the lower bound; afterwards, the local search methods adjust the search range. The MTS consists of iterations of local searches until the maximum number of function evaluations is reached. In the first iteration, the MTS conducts local searches on all M initial solutions, but in the following iterations, only some better solutions are chosen as foreground solutions and the MTS conducts local searches on these solutions. Three local search methods are provided for the MTS. The MTS first tests the performance of the three local search methods and then chooses the one that performs best, that is, the one that best fits the landscape of the neighborhood of the solution, to do the search. After conducting the search on the foreground solutions, the MTS applies Local Search 1 to the current best solution, trying to improve it. Before the end of an iteration, some better solutions are chosen as the foreground

solutions for the next iteration. The multiple trajectory search algorithm is described in the following.

Multiple Trajectory Search
    /* Generate M initial solutions */
    Build the simulated orthogonal array SOA_{M x N}
    For i = 1 to M
        For j = 1 to N
            X_i[j] = l_j + (u_j - l_j) * SOA[i, j] / (M - 1)
    Evaluate the function values of the X_i's
    For i = 1 to M
        Enable[i] <- TRUE
        Improve[i] <- TRUE
        SearchRangeX_i <- (UPPER_BOUND - LOWER_BOUND) / 2
    While (#ofEvaluation < predefined_max_evaluation)
        For i = 1 to M
            If Enable[i] = TRUE Then
                GradeX_i <- 0
                LS1_TestGrade <- 0; LS2_TestGrade <- 0; LS3_TestGrade <- 0
                For j = 1 to #ofLocalSearchTest
                    LS1_TestGrade <- LS1_TestGrade + LocalSearch1(X_i, SearchRangeX_i)
                    LS2_TestGrade <- LS2_TestGrade + LocalSearch2(X_i, SearchRangeX_i)
                    LS3_TestGrade <- LS3_TestGrade + LocalSearch3(X_i, SearchRangeX_i)
                Choose the local search with the best TestGrade and let it be LocalSearchK  /* K may be 1, 2, or 3 */
                For j = 1 to #ofLocalSearch
                    GradeX_i <- GradeX_i + LocalSearchK(X_i, SearchRangeX_i)
        For i = 1 to #ofLocalSearchBest
            LocalSearch1(BestSolution, SearchRangeBestSolution)
        For i = 1 to M
            Enable[i] <- FALSE
        Choose the #ofForeground X_i's whose GradeX_i are best among the M solutions and set their corresponding Enable[i] to TRUE
    End While

In the MTS, three local search methods are used to search different landscapes of the neighborhood of a solution. Local Search 1 searches along one dimension at a time, from the first dimension to the last dimension. Local Search 2 is similar to Local Search 1, except that it searches along a direction derived from about one-fourth of the dimensions. In both local search methods, the search range (SR) is cut to one-half whenever the previous local search did not make an improvement; when SR becomes very small (less than 10^-15), it is reset to 40% of the width of the search space. In Local Search 1, on the dimension being searched, the solution's coordinate along this dimension is first decreased by SR to see whether the objective function value improves. If it does, the search proceeds to the next dimension. If it does not, the solution is restored and the coordinate is then increased by 0.5*SR, again to see whether the objective function value improves. If it does, the search proceeds to the next dimension; if it does not, the solution is restored and the search proceeds to the next dimension. Local Search 1 and Local Search 2 are listed in the following.

Function LocalSearch1(X_k, SR)
    If Improve[k] = FALSE Then
        SR <- SR / 2
        If SR < 1e-15 Then
            SR <- (UPPER_BOUND - LOWER_BOUND) * 0.4
    Improve[k] <- FALSE
    For i = 1 to N
        X_k[i] <- X_k[i] - SR
        If X_k is better than the current best solution Then
            grade <- grade + BONUS1
            Update the current best solution
        If the function value of X_k is the same Then
            restore X_k to its original value
        Else If the function value of X_k degenerates Then
            restore X_k to its original value
            X_k[i] <- X_k[i] + 0.5 * SR
            If X_k is better than the current best solution Then
                grade <- grade + BONUS1
                Update the current best solution
            If the function value of X_k has not been improved Then
                restore X_k to its original value
            Else
                grade <- grade + BONUS2
                Improve[k] <- TRUE
        Else
            grade <- grade + BONUS2
            Improve[k] <- TRUE
    return grade
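To make the control flow of Local Search 1 concrete, the following is a minimal runnable Python sketch for a minimization problem. The objective f, the representation of the current best solution as a two-element list, and the bonus constants are simplifying assumptions for illustration; this mirrors the pseudocode above rather than reproducing the authors' implementation.

    def local_search_1(x, sr, f, best, improve, bonus1=10.0, bonus2=1.0,
                       lower=-100.0, upper=100.0):
        """One pass of Local Search 1 over all dimensions of solution x (a list).
        best = [best_x, best_f] is updated in place; returns (grade, sr, improve)."""
        if not improve:
            sr /= 2.0
            if sr < 1e-15:
                sr = (upper - lower) * 0.4
        improve = False
        grade = 0.0
        fx = f(x)
        for i in range(len(x)):
            old_xi, old_fx = x[i], fx
            x[i] -= sr                              # first move down along dimension i
            fx = f(x)
            if fx < best[1]:
                grade += bonus1
                best[0], best[1] = x.copy(), fx     # update the current best solution
            if fx == old_fx:                        # same value: restore, next dimension
                x[i], fx = old_xi, old_fx
            elif fx > old_fx:                       # degenerated: restore, try the other direction
                x[i], fx = old_xi, old_fx
                x[i] += 0.5 * sr
                fx = f(x)
                if fx < best[1]:
                    grade += bonus1
                    best[0], best[1] = x.copy(), fx
                if fx >= old_fx:                    # still not improved: restore
                    x[i], fx = old_xi, old_fx
                else:
                    grade += bonus2
                    improve = True
            else:                                   # improved: keep the move
                grade += bonus2
                improve = True
        return grade, sr, improve

A full MTS run would repeatedly call this function (and its two siblings) on the foreground solutions, accumulating the returned grade to decide which solutions stay in the foreground for the next iteration.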

Function LocalSearch2(X_k, SR)
    If Improve[k] = FALSE Then
        SR <- SR / 2
        If SR < 1e-15 Then
            SR <- (UPPER_BOUND - LOWER_BOUND) * 0.4
    Improve[k] <- FALSE
    For l = 1 to N
        For i = 1 to N
            r[i] <- Random{0, 1, 2, 3}
            D[i] <- Random{-1, 1}
        For i = 1 to N
            If r[i] = 0 Then
                X_k[i] <- X_k[i] - SR * D[i]
        If X_k is better than the current best solution Then
            grade <- grade + BONUS1
            Update the current best solution
        If the function value of X_k is the same Then
            restore X_k to its original value
        Else If the function value of X_k degenerates Then
            restore X_k to its original value
            For i = 1 to N
                If r[i] = 0 Then
                    X_k[i] <- X_k[i] + 0.5 * SR * D[i]
            If X_k is better than the current best solution Then
                grade <- grade + BONUS1
                Update the current best solution
            If the function value of X_k has not been improved Then
                restore X_k to its original value
            Else
                grade <- grade + BONUS2
                Improve[k] <- TRUE
        Else
            grade <- grade + BONUS2
            Improve[k] <- TRUE
    return grade

Local Search 3 is different from Local Search 1 and Local Search 2. It considers three small movements along each dimension and heuristically determines the movement of the solution along each dimension. Although the search proceeds along each dimension from the first to the last, the evaluation of the objective function value is done only after all the dimensions have been searched, and the solution is moved to the new position only if the objective function has improved at this evaluation. Local Search 3 is described in the following.

Function LocalSearch3(X, SR)
    For i = 1 to N
        X1 <- X with its i-th coordinate increased by 0.1
        Y1 <- X with its i-th coordinate decreased by 0.1
        X2 <- X with its i-th coordinate increased by 0.2
        If X1 is better than the current best solution Then
            grade <- grade + BONUS1
            Update the current best solution
        If Y1 is better than the current best solution Then
            grade <- grade + BONUS1
            Update the current best solution
        If X2 is better than the current best solution Then
            grade <- grade + BONUS1
            Update the current best solution
        D1 <- F(X) - F(X1)
        If D1 > 0 Then    /* X1 is better than X */
            grade <- grade + BONUS2
        D2 <- F(X) - F(Y1)
        If D2 > 0 Then    /* Y1 is better than X */
            grade <- grade + BONUS2
        D3 <- F(X) - F(X2)
        If D3 > 0 Then    /* X2 is better than X */
            grade <- grade + BONUS2
        a <- Random[0.4, 0.5]
        b <- Random[0.1, 0.3]
        c <- Random[0, 1]
        X[i] <- X[i] + a*(D1 - D2) + b*(D3 - 2*D1) + c
    If the function value of X has not been improved Then
        restore X to its original value
    Else
        grade <- grade + BONUS2
    return grade

2 Materials and Methods

2.1 Dataset
In order to compare the prediction accuracy rates with the previously reported method of Cheng et al. [12], the same dataset SPX used by them was employed in the experiments. In SPX, all proteins that contain at least one intrachain disulfide bond were extracted from the PDB on May 17, 2004, and all

proteins containing fewer than 10 amino acids were removed. Furthermore, to reduce the overrepresentation of particular protein families, Cheng et al. used UniqueProt, a protein redundancy reduction tool based on the HSSP distance [40], choosing 1018 proteins by setting the HSSP cut-off distance to 10. In SPX, the protein sequences were randomly divided into 10 subsets of roughly equal size for the 10-fold cross-validation experiment.

2.2 Methodology
Cheng et al. [12] predicted the bonding state of cysteines first, and then predicted the disulfide bond pattern for the predicted oxidized cysteines. Our method, instead of predicting the bonding state first, directly predicts the bonding probability of all pairs of cysteines. It uses the cysteine position difference, the cysteine index difference, the predicted secondary structure of the protein and the PSSM score as features. The SVM is trained to compute the connectivity probabilities of all cysteine pairs. The MTS [38] is used to evolve the parameters C and γ of the SVM and the window sizes for the predicted secondary structure and the PSSM. The maximum weight perfect matching algorithm is then used to find the disulfide connectivity pattern without prior knowledge of the bonding state of cysteines.

2.3 Features
(1) NCPD (Normalized Cysteine Position Difference): Let {c_1, c_2, ..., c_n} be the positions of the cysteines in ascending order. The normalized cysteine position difference between c_i and c_j is defined as |c_i - c_j| / (c_n - c_1).
(2) NCID (Normalized Cysteine Index Difference): Let {c_1, c_2, ..., c_n} be the positions of the cysteines in ascending order. The normalized cysteine index difference between c_i and c_j is defined as |i - j| / n.
(3) PSSM: PSI-BLAST [41] is used to obtain the local sequence profiles. Its output file contains four parts, the first of which is the position-specific scoring matrix (PSSM). The PSSM score is used as one of the features.
(4) PSS (Predicted Secondary Structure): The predicted secondary structure obtained by applying Jones' prediction method [42] is used as a feature. In practice, the PSIPRED program is used to predict the secondary structure information.
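As an illustration of the two sequence-derived features (1) and (2) above, the following minimal Python sketch computes the NCPD and NCID of a cysteine pair; the function names, the zero-based pair indices and the example positions are ours.

    def ncpd(positions, i, j):
        """Normalized Cysteine Position Difference between the i-th and j-th
        cysteines: |c_i - c_j| / (c_n - c_1), with positions sorted ascending."""
        c = sorted(positions)
        return abs(c[i] - c[j]) / (c[-1] - c[0])

    def ncid(positions, i, j):
        """Normalized Cysteine Index Difference: |i - j| / n."""
        return abs(i - j) / len(positions)

    # Hypothetical cysteine positions in a sequence (indices are 0-based here,
    # whereas the text above numbers cysteines from 1).
    positions = [5, 21, 40, 77, 102, 130]
    print(ncpd(positions, 0, 3), ncid(positions, 0, 3))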
2.4 Construction of the Prediction Model Based on SVM
It is noted that the SVM is superior to traditional statistical and neural network classifiers in many applications. However, it is critical to determine a proper combination of the SVM parameters (C and γ) in order to achieve good classification performance. The SVM implementation used in this work is LIBSVM [43]. Since the prediction rate is highly influenced by the values of the parameters C and γ, the multiple trajectory search [38] is used to find good settings of the parameter values for the SVM and of the window sizes for the PSS and the PSSM.

2.5 Multiple Trajectory Search for Selecting SVM Parameters and Window Sizes
The multiple trajectory search can find an optimal or near-optimal solution within an acceptable time, and it is faster than dynamic programming or the branch-and-bound strategy. Previously, some research works applied evolutionary algorithms to select features in a first phase and then used the selected features to train the SVM in a second phase. In this study, the MTS and the SVM training are tightly integrated. Since the values of the parameters C and γ are critical to the classification accuracy of the SVM, selecting proper values of C and γ is an important task. Traditionally, a regular grid search was used to perform the parameter value selection; however, it is very time-consuming. In this work, the MTS is integrated with the SVM training to select not only the values of the parameters C and γ but also the window sizes of the PSS and the PSSM. As shown in Fig. 3, a chromosome is coded as S = (C, γ, S_1, S_2), where C and γ are the log values of the parameters C and γ, and S_1 and S_2 are the window sizes for the PSS and the PSSM, respectively. The fitness function is defined as the accuracy of the SVM on disulfide connectivity prediction. The flowchart of the integration of the MTS and the SVM training is shown in Fig. 4.

Fig. 3. Chromosome of the proposed method: log C | log γ | S_1 | S_2 (the SVM parameters and the window sizes for the PSS and the PSSM).
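The coupling between a chromosome and the SVM training can be sketched as follows. This is a hedged illustration: it uses scikit-learn's SVC as a stand-in for LIBSVM (the library actually used in this work), assumes base-2 logarithms for the encoded C and γ, assumes cross-validated accuracy as the fitness, and the helper build_features and the labels are hypothetical.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def fitness(chromosome, build_features, labels, folds=10):
        """Decode a chromosome (log2 C, log2 gamma, PSS window, PSSM window) and
        return the cross-validated accuracy of the resulting SVM, which the MTS
        then tries to maximize."""
        log_c, log_gamma, w_pss, w_pssm = chromosome
        X = build_features(int(round(w_pss)), int(round(w_pssm)))  # re-extract features with these windows
        clf = SVC(C=2.0 ** log_c, gamma=2.0 ** log_gamma)
        scores = cross_val_score(clf, X, labels, cv=folds)
        return scores.mean()

For the matching step described next, the SVM would additionally be trained with probability estimates enabled (LIBSVM's probability option), so that each cysteine pair receives a connectivity probability rather than only a class label.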

Fig. 4. Flowchart of the proposed method.

2.6 Maximum weight perfect matching
When a test protein is given, the connectivity probability of each cysteine pair is computed by the trained SVM. A complete graph is then constructed with all cysteines as nodes, and the weight associated with each edge is the disulfide connectivity probability of the pair of cysteines incident to that edge. Gabow's algorithm [44] is then applied to find the maximum weight perfect matching. Because Gabow's algorithm can only be applied with integer edge labels, the disulfide connectivity probability of each pair of cysteines is scaled and truncated to an integer to serve as the weight of the edge. This matching represents the predicted disulfide connectivity pattern. We also set a probability threshold: when the disulfide connectivity probability of two cysteines is greater than the threshold, there is a bond between the two cysteines; otherwise, there is no bond between them.

For the dataset SPX with 10-fold cross-validation, the results of the parameter value selection by the proposed method are as follows: log C = 7.4 and log γ = -4.6 for the SVM, and window sizes of 1 and 3 for the PSS and the PSSM. Moreover, the probability threshold for a bond to exist between two cysteines is set to 0.2; this value was determined empirically.

3 Experiment Results
In order to evaluate the performance of the prediction, two accuracy indices Q_P and Q_C are used:

    Q_P = \frac{C_P}{T_P},    Q_C = \frac{C_C}{T_C}

where C_P is the number of proteins whose bonding patterns are correctly predicted, T_P is the total number of proteins in the test set, C_C is the number of disulfide bridges that are correctly predicted, and T_C is the total number of disulfide bridges in the test proteins.
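A minimal sketch of the matching and scoring steps is given below. It uses networkx's max_weight_matching as a stand-in for Gabow's algorithm (the paper's choice), assumes the connectivity probabilities have already been produced by the trained SVM as a dictionary keyed by ordered cysteine pairs, and the helper names and the default threshold are ours.

    import networkx as nx

    def predict_pattern(cysteines, prob, threshold=0.2):
        """Build a complete graph on the cysteines (sorted positions), weight each
        edge by the SVM connectivity probability (scaled to an integer), take a
        maximum weight matching of maximum cardinality, and keep only bonds whose
        probability exceeds the threshold."""
        g = nx.Graph()
        g.add_nodes_from(cysteines)
        for i, a in enumerate(cysteines):
            for b in cysteines[i + 1:]:
                g.add_edge(a, b, weight=int(prob[(a, b)] * 1000))  # integer weights
        matching = nx.max_weight_matching(g, maxcardinality=True)
        return {tuple(sorted(e)) for e in matching
                if prob[tuple(sorted(e))] > threshold}

    def q_p(predicted_patterns, true_patterns):
        """Q_P: fraction of proteins whose whole bonding pattern is predicted correctly."""
        correct = sum(p == t for p, t in zip(predicted_patterns, true_patterns))
        return correct / len(true_patterns)

    def q_c(predicted_patterns, true_patterns):
        """Q_C: fraction of the true disulfide bridges that are predicted."""
        total = sum(len(t) for t in true_patterns)
        hit = sum(len(p & t) for p, t in zip(predicted_patterns, true_patterns))
        return hit / total

Here each pattern is a set of cysteine-position pairs, so pattern-level agreement (Q_P) is set equality and bridge-level agreement (Q_C) is counted through set intersection.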

Tables 1 and 2 show the results for bridge classification and for the prediction of the disulfide bonding pattern on the dataset SPX. For bridge classification, Cheng et al. [12] listed sensitivity and specificity instead of accuracy; from the definitions of sensitivity and specificity, the value of accuracy generally lies between them. From Table 1, it is noted that the overall sensitivity is about 50% for Cheng's method, while the prediction accuracy (Qc) of our method is about 60%, an increase of roughly ten percentage points. As for the disulfide connectivity prediction, Cheng's method with true secondary structure (SS) and solvent accessibility (SA) information in the inputs has an accuracy rate of 51%, and Cheng's method with predicted secondary structure (PSS) and predicted solvent accessibility (PSA) information in the inputs has a prediction accuracy of only 48%. In this study, using the predicted secondary structure in the inputs, the prediction accuracy is 54.5%, an increase from 48% to 54.5%.

Table 1. Bridge classification results for the dataset SPX without prior knowledge of the bonding state of cysteines.

# of bonds   Cheng et al. (2006)            This work
             Sensitivity   Specificity      Accuracy (Qc)
1            71%           48%              7.9%
2            59%           6%               73.%
3            55%           61%              69.7%
4            44%           48%              59.7%
5            3%            35%              4.%
6            3%            36%              33.3%
7            9%            3%               7.1%
8            %             %                .5%
9            44%           5%               44.4%
10           33%           36%              3.3%
12           38%           39%              58.3%
14           79%           85%              1.%
16           13%           13%              15.6%
17           53%           6%               55.9%
25           3%            53%              .%
26           31%           51%              3.8%
All          5%            51%              6.%

Table 2. Disulfide connectivity prediction results for the dataset SPX without prior knowledge of the bonding state of cysteines.

# of bonds   Cheng et al. (2006)                 This work
             Qp with SS and SA   Qp with PSS and PSA   Accuracy (Qp)
1            59%                 59%                   6.6%
2            59%                 56%                   65.9%
3            5%                  47%                   59.8%
4            34%                 %                     36.4%
5~6          %                   13%                   11.8%
All          51%                 48%                   54.5%

4 Conclusion
Recently, the progress in the prediction of the oxidation states of cysteines in protein sequences has been significant, but for the prediction of the bonding pattern of cysteines, much research effort is still needed to improve the prediction accuracy. To these authors' knowledge, all previous approaches except those presented by Ferrè and Clote [13] and Cheng et al. [12] assume that the oxidation states of cysteines are known in advance. To solve the prediction of the bonding pattern of cysteines in practice, this assumption eventually has to be removed. In this work, without prior knowledge of the oxidation states of cysteines, by integrating the MTS with the SVM training to tune the parameters of the SVM and the window sizes of the PSS and the PSSM, the proposed method achieves an accuracy of 54.5% on bonding pattern prediction, which improves on the accuracy of 51% obtained by Cheng et al. [12].

References
[1] C. C. Chuang, C. Y. Chen, J. M. Yang, P. C. Lyu, J. K. Hwang, Relationship between protein structures and disulfide-bonding patterns, Proteins, Vol. 55, 2003, pp. 1-5.
[2] H. W. T. van Vlijmen, A. Gupta, L. S. Narasimhan, J. Singh, A novel database of disulfide patterns and its application to the discovery of distantly related homologs, J. Mol. Biol., Vol. 335, 2004.
[3] W. Antuch, P. Guntert, M. Billeter, T. Hawthorne, H. Grossenbacher, K. Wuthrich, NMR solution structure of the recombinant tick anticoagulant protein (rTAP), a factor Xa inhibitor from the tick Ornithodoros moubata, FEBS Letters, Vol. 352, 1994.

[4] L. C. Loredana, E. B. Steven, J. P. H. Tim, C. Chothia, A. G. Murzin, SCOP database in 2002: refinements accommodate structural genomics, Nucleic Acids Res., Vol. 30, 2002.
[5] A. Fiser, I. Simon, Predicting the oxidation state of cysteines by multiple sequence alignment, Bioinformatics, Vol. 16, 2000.
[6] P. Fariselli, P. Riccobelli, R. Casadio, Role of evolutionary information in predicting the disulfide-bonding state of cysteine in proteins, Proteins, Vol. 36, 1999.
[7] P. L. Martelli, P. Fariselli, L. Malaguti, R. Casadio, Prediction of the disulfide-bonding state of cysteines in proteins with hidden neural networks, Protein Engineering, Vol. 15, 2002.
[8] Y. C. Chen, Y. S. Lin, C. J. Lin, J. K. Hwang, Prediction of the bonding states of cysteines using the support vector machines based on multiple feature vectors and cysteine state sequences, Proteins, Vol. 55, 2004.
[9] P. Fariselli, R. Casadio, Prediction of disulfide connectivity in proteins, Bioinformatics, Vol. 17, 2001.
[10] P. Fariselli, P. L. Martelli, R. Casadio, A neural network based method for predicting the disulfide connectivity in proteins, in E. Damiani et al. (eds.), Knowledge-Based Intelligent Information Engineering Systems and Allied Technologies (KES), IOS Press, Amsterdam, 2001.
[11] A. Vullo, P. Frasconi, Disulfide connectivity prediction using recursive neural networks and evolutionary information, Bioinformatics, Vol. 20, 2004.
[12] J. Cheng, H. Saigo, P. Baldi, Large-scale prediction of disulphide bridges using kernel methods, two-dimensional recursive neural networks, and weighted graph matching, Proteins, Vol. 62, 2006.
[13] F. Ferrè, P. Clote, Disulfide connectivity prediction using secondary structure information and diresidue frequencies, Bioinformatics, Vol. 21, 2005.
[14] C. H. Tsai, B. J. Chen, H. H. Chan, H. L. Liu, C. Y. Kao, Improving disulfide connectivity prediction with sequential distance between oxidized cysteines, Bioinformatics, Vol. 21, 2005.
[15] Y. C. Chen, J. K. Hwang, Prediction of disulfide connectivity from protein sequences, Proteins, Vol. 61, 2005.
[16] E. Zhao, H. L. Liu, C. H. Tsai, H. K. Tsai, C. H. Chan, C. Y. Kao, Cysteine separation profiles on protein sequences infer disulfide connectivity, Bioinformatics, Vol. 21, 2005.
[17] B. J. Chen, C. H. Tsai, C. H. Chan, C. Y. Kao, Disulfide connectivity prediction with 70% accuracy using two-level models, Proteins, Vol. 55(1), 2006.
[18] C. H. Lu, Y. C. Chen, C. S. Yu, J. K. Hwang, Predicting disulfide connectivity patterns, Proteins, Vol. 67, 2007.
[19] J. Song, Z. Yuan, H. Tan, T. Huber, K. Burrage, Predicting disulfide connectivity from protein sequence using multiple sequence feature vectors and secondary structure, Bioinformatics, Vol. 23, 2007.
[20] R. Rubinstein, A. Fiser, Predicting disulfide bond connectivity in proteins by correlated mutations analysis, Bioinformatics, Vol. 24, 2008.
[21] C. C. Chang, C. J. Lin, LIBSVM: a library for support vector machines, 2001.
[22] S. Idicula-Thomas, A. J. Kulkarni, B. D. Kulkarni, V. K. Jayaraman, P. V. Balaji, A support vector machine-based method for predicting the propensity of a protein to be soluble or to form inclusion body on overexpression in Escherichia coli, Bioinformatics, Vol. 22, 2006.
[23] P. M. Kasson, J. B. Huppa, M. M. Davis, A. T. Brunger, A hybrid machine-learning approach for segmentation of protein localization data, Bioinformatics, Vol. 21, 2005.
[24] M. E. Mavroforakis, H. V. Georgiou, N. Dimitropoulos, D. Cavouras, S. Theodoridis, Mammographic masses characterization based on localized texture and dataset fractal analysis using linear, neural and support vector machine classifiers, Artif. Intell. Med., Vol. 37, 2006.
[25] C. W. Hsu, C. J. Lin, A comparison of methods for multiclass support vector machines, IEEE Trans. Neural Networks, Vol. 13, 2002.
[26] M. P.
S. Brown, W. N. Grundy, D. Lin, N. Cristianini, C. Sugnet, T. S. Furey, M. Ares, et al., Knowledge-based analysis of microarray gene expression data using support vector machines, PNAS, Vol. 97, 2000.
[27] D. DeCoste, B. Schölkopf, Training invariant support vector machines, Machine Learning, Vol. 46, 2002.

[28] Y. LeCun, L. Jackel, L. Bottou, A. Brunot, C. Cortes, J. Denker, et al., Comparison of learning algorithms for handwritten digit recognition, International Conference on Artificial Neural Networks, 1995.
[29] J. Tong, P. Jiang, Z. Lu, RISP: A web-based server for prediction of RNA-binding sites in proteins, Computer Methods and Programs in Biomedicine, Vol. 90, 2008.
[30] Y. Qi, F. Lin, K. K. Wong, A networked multi-class SVM for protein secondary structure prediction, WSEAS International Conference on Mathematical Biology and Ecology (MABE '05), Udine, Italy, 2005.
[31] M. Brown, W. Grundy, D. Lin, N. Cristianini, C. Sugnet, M. Ares, D. Haussler, Support vector machine classification of microarray gene expression data, Technical Report UCSC-CRL-99-09, University of California, Santa Cruz, 1999.
[32] C. H. Yeang, S. Ramaswamy, P. Tamayo, S. Mukherjee, R. M. Rifkin, M. Angelo, M. Reich, E. Lander, J. Mesirov, T. Golub, Molecular classification of multiple tumor types, Bioinformatics, Vol. 17, 2001, pp. S316-S322.
[33] X. P. Zhang, R. Gui, Electrical energy consumption forecasting based on cointegration and a support vector machine in China, WSEAS Transactions on Mathematics, Issue 1, Vol. 6, 2007.
[34] H. Xiao, X. Zhang, Comparison studies on classification for remote sensing image on data mining method, WSEAS Transactions on Computers, Issue 5, Vol. 7, 2008.
[35] G. Pajares, Support vector machines for shade identification in urban areas, WSEAS Transactions on Information Science and Applications, Vol. 2, No. 1, 2005.
[36] S. Theodoridis, K. Koutroumbas, Pattern Recognition, 2nd edn., Academic Press, San Diego, 2003.
[37] N. Cristianini, J. Shawe-Taylor, An Introduction to Support Vector Machines and Other Kernel-Based Methods, Cambridge University Press, Cambridge, UK, 2000.
[38] L. Y. Tseng, C. Chen, Multiple trajectory search for large scale global optimization, Proceedings of the 2008 IEEE Congress on Evolutionary Computation (CEC 2008), 2008.
[39] L. Y. Tseng, C. Chen, Multiple trajectory search for multiobjective optimization, Proceedings of the 2007 IEEE Congress on Evolutionary Computation, 2007.
[40] C. Sander, R. Schneider, Database of homology-derived protein structures and the structural meaning of sequence alignment, Proteins, Vol. 9, 1991.
[41] S. F. Altschul, T. L. Madden, A. A. Schäffer, J. Zhang, Z. Zhang, W. Miller, D. J. Lipman, Gapped BLAST and PSI-BLAST: a new generation of protein database search programs, Nucleic Acids Res., Vol. 25, 1997.
[42] D. T. Jones, Protein secondary structure prediction based on position-specific scoring matrices, J. Mol. Biol., Vol. 292, 1999.
[43] C. C. Chang, C. J. Lin, LIBSVM: a library for support vector machines, 2001.
[44] H. N. Gabow, Implementation of algorithms for maximum matching on nonbipartite graphs, PhD Thesis, Stanford University, CA, 1973.


More information

Face Recognition University at Buffalo CSE666 Lecture Slides Resources:

Face Recognition University at Buffalo CSE666 Lecture Slides Resources: Face Recognton Unversty at Buffalo CSE666 Lecture Sldes Resources: http://www.face-rec.org/algorthms/ Overvew of face recognton algorthms Correlaton - Pxel based correspondence between two face mages Structural

More information

An Optimal Algorithm for Prufer Codes *

An Optimal Algorithm for Prufer Codes * J. Software Engneerng & Applcatons, 2009, 2: 111-115 do:10.4236/jsea.2009.22016 Publshed Onlne July 2009 (www.scrp.org/journal/jsea) An Optmal Algorthm for Prufer Codes * Xaodong Wang 1, 2, Le Wang 3,

More information

SUMMARY... I TABLE OF CONTENTS...II INTRODUCTION...

SUMMARY... I TABLE OF CONTENTS...II INTRODUCTION... Summary A follow-the-leader robot system s mplemented usng Dscrete-Event Supervsory Control methods. The system conssts of three robots, a leader and two followers. The dea s to get the two followers to

More information

Optimizing Document Scoring for Query Retrieval

Optimizing Document Scoring for Query Retrieval Optmzng Document Scorng for Query Retreval Brent Ellwen baellwe@cs.stanford.edu Abstract The goal of ths project was to automate the process of tunng a document query engne. Specfcally, I used machne learnng

More information

Taxonomy of Large Margin Principle Algorithms for Ordinal Regression Problems

Taxonomy of Large Margin Principle Algorithms for Ordinal Regression Problems Taxonomy of Large Margn Prncple Algorthms for Ordnal Regresson Problems Amnon Shashua Computer Scence Department Stanford Unversty Stanford, CA 94305 emal: shashua@cs.stanford.edu Anat Levn School of Computer

More information

BioTechnology. An Indian Journal FULL PAPER. Trade Science Inc.

BioTechnology. An Indian Journal FULL PAPER. Trade Science Inc. [Type text] [Type text] [Type text] ISSN : 0974-74 Volume 0 Issue BoTechnology 04 An Indan Journal FULL PAPER BTAIJ 0() 04 [684-689] Revew on Chna s sports ndustry fnancng market based on market -orented

More information

Machine Learning. Topic 6: Clustering

Machine Learning. Topic 6: Clustering Machne Learnng Topc 6: lusterng lusterng Groupng data nto (hopefully useful) sets. Thngs on the left Thngs on the rght Applcatons of lusterng Hypothess Generaton lusters mght suggest natural groups. Hypothess

More information

A Robust Method for Estimating the Fundamental Matrix

A Robust Method for Estimating the Fundamental Matrix Proc. VIIth Dgtal Image Computng: Technques and Applcatons, Sun C., Talbot H., Ourseln S. and Adraansen T. (Eds.), 0- Dec. 003, Sydney A Robust Method for Estmatng the Fundamental Matrx C.L. Feng and Y.S.

More information

Protein Secondary Structure Prediction Using Support Vector Machines, Nueral Networks and Genetic Algorithms

Protein Secondary Structure Prediction Using Support Vector Machines, Nueral Networks and Genetic Algorithms Georga State Unversty ScholarWorks @ Georga State Unversty Computer Scence Theses Department of Computer Scence 5-3-2007 Proten Secondary Structure Predcton Usng Support Vector Machnes, Nueral Networks

More information

Application of Improved Fish Swarm Algorithm in Cloud Computing Resource Scheduling

Application of Improved Fish Swarm Algorithm in Cloud Computing Resource Scheduling , pp.40-45 http://dx.do.org/10.14257/astl.2017.143.08 Applcaton of Improved Fsh Swarm Algorthm n Cloud Computng Resource Schedulng Yu Lu, Fangtao Lu School of Informaton Engneerng, Chongqng Vocatonal Insttute

More information

Lecture 5: Multilayer Perceptrons

Lecture 5: Multilayer Perceptrons Lecture 5: Multlayer Perceptrons Roger Grosse 1 Introducton So far, we ve only talked about lnear models: lnear regresson and lnear bnary classfers. We noted that there are functons that can t be represented

More information

Complex System Reliability Evaluation using Support Vector Machine for Incomplete Data-set

Complex System Reliability Evaluation using Support Vector Machine for Incomplete Data-set Internatonal Journal of Performablty Engneerng, Vol. 7, No. 1, January 2010, pp.32-42. RAMS Consultants Prnted n Inda Complex System Relablty Evaluaton usng Support Vector Machne for Incomplete Data-set

More information

Learning-Based Top-N Selection Query Evaluation over Relational Databases

Learning-Based Top-N Selection Query Evaluation over Relational Databases Learnng-Based Top-N Selecton Query Evaluaton over Relatonal Databases Lang Zhu *, Wey Meng ** * School of Mathematcs and Computer Scence, Hebe Unversty, Baodng, Hebe 071002, Chna, zhu@mal.hbu.edu.cn **

More information

A Deflected Grid-based Algorithm for Clustering Analysis

A Deflected Grid-based Algorithm for Clustering Analysis A Deflected Grd-based Algorthm for Clusterng Analyss NANCY P. LIN, CHUNG-I CHANG, HAO-EN CHUEH, HUNG-JEN CHEN, WEI-HUA HAO Department of Computer Scence and Informaton Engneerng Tamkang Unversty 5 Yng-chuan

More information

An Entropy-Based Approach to Integrated Information Needs Assessment

An Entropy-Based Approach to Integrated Information Needs Assessment Dstrbuton Statement A: Approved for publc release; dstrbuton s unlmted. An Entropy-Based Approach to ntegrated nformaton Needs Assessment June 8, 2004 Wllam J. Farrell Lockheed Martn Advanced Technology

More information

Overview. Basic Setup [9] Motivation and Tasks. Modularization 2008/2/20 IMPROVED COVERAGE CONTROL USING ONLY LOCAL INFORMATION

Overview. Basic Setup [9] Motivation and Tasks. Modularization 2008/2/20 IMPROVED COVERAGE CONTROL USING ONLY LOCAL INFORMATION Overvew 2 IMPROVED COVERAGE CONTROL USING ONLY LOCAL INFORMATION Introducton Mult- Smulator MASIM Theoretcal Work and Smulaton Results Concluson Jay Wagenpfel, Adran Trachte Motvaton and Tasks Basc Setup

More information

Backpropagation: In Search of Performance Parameters

Backpropagation: In Search of Performance Parameters Bacpropagaton: In Search of Performance Parameters ANIL KUMAR ENUMULAPALLY, LINGGUO BU, and KHOSROW KAIKHAH, Ph.D. Computer Scence Department Texas State Unversty-San Marcos San Marcos, TX-78666 USA ae049@txstate.edu,

More information

Adaptive Virtual Support Vector Machine for the Reliability Analysis of High-Dimensional Problems

Adaptive Virtual Support Vector Machine for the Reliability Analysis of High-Dimensional Problems Proceedngs of the ASME 2 Internatonal Desgn Engneerng Techncal Conferences & Computers and Informaton n Engneerng Conference IDETC/CIE 2 August 29-3, 2, Washngton, D.C., USA DETC2-47538 Adaptve Vrtual

More information

5 The Primal-Dual Method

5 The Primal-Dual Method 5 The Prmal-Dual Method Orgnally desgned as a method for solvng lnear programs, where t reduces weghted optmzaton problems to smpler combnatoral ones, the prmal-dual method (PDM) has receved much attenton

More information

An Improved Image Segmentation Algorithm Based on the Otsu Method

An Improved Image Segmentation Algorithm Based on the Otsu Method 3th ACIS Internatonal Conference on Software Engneerng, Artfcal Intellgence, Networkng arallel/dstrbuted Computng An Improved Image Segmentaton Algorthm Based on the Otsu Method Mengxng Huang, enjao Yu,

More information

A Fast Visual Tracking Algorithm Based on Circle Pixels Matching

A Fast Visual Tracking Algorithm Based on Circle Pixels Matching A Fast Vsual Trackng Algorthm Based on Crcle Pxels Matchng Zhqang Hou hou_zhq@sohu.com Chongzhao Han czhan@mal.xjtu.edu.cn Ln Zheng Abstract: A fast vsual trackng algorthm based on crcle pxels matchng

More information

Fixing Max-Product: Convergent Message Passing Algorithms for MAP LP-Relaxations

Fixing Max-Product: Convergent Message Passing Algorithms for MAP LP-Relaxations Fxng Max-Product: Convergent Message Passng Algorthms for MAP LP-Relaxatons Amr Globerson Tomm Jaakkola Computer Scence and Artfcal Intellgence Laboratory Massachusetts Insttute of Technology Cambrdge,

More information

A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION

A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION 1 THE PUBLISHING HOUSE PROCEEDINGS OF THE ROMANIAN ACADEMY, Seres A, OF THE ROMANIAN ACADEMY Volume 4, Number 2/2003, pp.000-000 A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION Tudor BARBU Insttute

More information

GA-Based Learning Algorithms to Identify Fuzzy Rules for Fuzzy Neural Networks

GA-Based Learning Algorithms to Identify Fuzzy Rules for Fuzzy Neural Networks Seventh Internatonal Conference on Intellgent Systems Desgn and Applcatons GA-Based Learnng Algorthms to Identfy Fuzzy Rules for Fuzzy Neural Networks K Almejall, K Dahal, Member IEEE, and A Hossan, Member

More information

INF 4300 Support Vector Machine Classifiers (SVM) Anne Solberg

INF 4300 Support Vector Machine Classifiers (SVM) Anne Solberg INF 43 Support Vector Machne Classfers (SVM) Anne Solberg (anne@f.uo.no) 9..7 Lnear classfers th mamum margn for toclass problems The kernel trck from lnear to a hghdmensonal generalzaton Generaton from

More information

Improvement of Spatial Resolution Using BlockMatching Based Motion Estimation and Frame. Integration

Improvement of Spatial Resolution Using BlockMatching Based Motion Estimation and Frame. Integration Improvement of Spatal Resoluton Usng BlockMatchng Based Moton Estmaton and Frame Integraton Danya Suga and Takayuk Hamamoto Graduate School of Engneerng, Tokyo Unversty of Scence, 6-3-1, Nuku, Katsuska-ku,

More information

Solving Route Planning Using Euler Path Transform

Solving Route Planning Using Euler Path Transform Solvng Route Plannng Usng Euler Path ransform Y-Chong Zeng Insttute of Informaton Scence Academa Snca awan ychongzeng@s.snca.edu.tw Abstract hs paper presents a method to solve route plannng problem n

More information

Corner-Based Image Alignment using Pyramid Structure with Gradient Vector Similarity

Corner-Based Image Alignment using Pyramid Structure with Gradient Vector Similarity Journal of Sgnal and Informaton Processng, 013, 4, 114-119 do:10.436/jsp.013.43b00 Publshed Onlne August 013 (http://www.scrp.org/journal/jsp) Corner-Based Image Algnment usng Pyramd Structure wth Gradent

More information

An Image Fusion Approach Based on Segmentation Region

An Image Fusion Approach Based on Segmentation Region Rong Wang, L-Qun Gao, Shu Yang, Yu-Hua Cha, and Yan-Chun Lu An Image Fuson Approach Based On Segmentaton Regon An Image Fuson Approach Based on Segmentaton Regon Rong Wang, L-Qun Gao, Shu Yang 3, Yu-Hua

More information

Evolutionary Support Vector Regression based on Multi-Scale Radial Basis Function Kernel

Evolutionary Support Vector Regression based on Multi-Scale Radial Basis Function Kernel Eolutonary Support Vector Regresson based on Mult-Scale Radal Bass Functon Kernel Tanasanee Phenthrakul and Boonserm Kjsrkul Abstract Kernel functons are used n support ector regresson (SVR) to compute

More information

Outline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like:

Outline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like: Self-Organzng Maps (SOM) Turgay İBRİKÇİ, PhD. Outlne Introducton Structures of SOM SOM Archtecture Neghborhoods SOM Algorthm Examples Summary 1 2 Unsupervsed Hebban Learnng US Hebban Learnng, Cntd 3 A

More information

GSLM Operations Research II Fall 13/14

GSLM Operations Research II Fall 13/14 GSLM 58 Operatons Research II Fall /4 6. Separable Programmng Consder a general NLP mn f(x) s.t. g j (x) b j j =. m. Defnton 6.. The NLP s a separable program f ts objectve functon and all constrants are

More information