An Improved Particle Swarm Optimization for Feature Selection


Journal of Bionic Engineering 8 (2011)

An Improved Particle Swarm Optimization for Feature Selection

Yuanning Liu 1,2, Gang Wang 1,2, Huiling Chen 1,2, Hao Dong 1,2, Xiaodong Zhu 1,2, Sujing Wang 1,2

1. College of Computer Science and Technology, Jilin University, Changchun 130012, P. R. China
2. Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun 130012, P. R. China

Abstract

Particle Swarm Optimization (PSO) is a popular bionic algorithm, based on the social behavior associated with bird flocking, for optimization problems. To maintain the diversity of swarms, a few studies of multi-swarm strategy have been reported. However, the competition among swarms, i.e. the reservation or destruction of a swarm, has not been considered further. In this paper, we formulate four rules by introducing a mechanism for survival of the fittest, which simulates the competition among the swarms. Based on this mechanism, we design a modified Multi-Swarm PSO (MSPSO) to solve discrete problems, which consists of a number of sub-swarms and a multi-swarm scheduler that can monitor and control each sub-swarm using the rules. To further address the feature selection problem, we propose an Improved Feature Selection (IFS) method by integrating MSPSO and Support Vector Machines (SVM) with the F-score method. The IFS method aims to achieve higher generalization capability by performing kernel parameter optimization and feature selection simultaneously. The performance of the proposed method is compared with that of the standard PSO based, Genetic Algorithm (GA) based and grid search based methods on 10 benchmark datasets taken from the UCI machine learning and StatLog databases. The numerical results and statistical analysis show that the proposed IFS method performs significantly better than the other three methods in terms of prediction accuracy, with a smaller subset of features.

Keywords: particle swarm optimization, feature selection, data mining, support vector machines

Copyright 2011, Jilin University. Published by Elsevier Limited and Science Press. All rights reserved.
1 Introduction

Feature selection is one of the most important factors influencing the classification accuracy rate. If the dataset contains a large number of features, the dimension of the feature space will be large and noisy, degrading the classification accuracy rate. An efficient and robust feature selection method can eliminate noisy, irrelevant and redundant data [1]. Feature subset selection algorithms can be categorized into two types: filter algorithms and wrapper algorithms. Filter algorithms select the feature subset before the application of any classification algorithm, and remove the less important features from the subset. Wrapper methods define the learning algorithm, the performance criteria and the search strategy. The learning algorithm searches for the subset using the training data and the performance of the current subset.

Particle Swarm Optimization (PSO) was motivated by the simulation of the simplified social behavior of bird flocking, and was first developed by Kennedy and Eberhart [2,3]. It is easy to implement with few parameters, and it is widely used to solve optimization problems, including the feature selection problem [4,5]. Various attempts have been made to improve the performance of standard PSO in recent years. However, few studies have put emphasis on the multi-swarm strategy. Usually, PSO-based algorithms have only one swarm that contains a number of particles. PSO-based algorithms using a multi-swarm strategy have greater exploration and exploitation abilities, because different swarms can explore different parts of the solution space [6]. On the other hand, standard PSO converges over time, losing diversity and thus the ability to quickly react to a peak's move. Multi-swarm PSO can sustain the diversity of the swarms and ensure adaptability, thereby improving the performance of PSO.

Corresponding author: Xiaodong Zhu
E-mail: zhuxiaodong.jlu@gmail.com

Blackwell and Branke [7] split the population of particles into a set of interacting swarms. They used a simple competition mechanism among swarms that are close to each other: the winner is the swarm with the best function value at its swarm attractor; the loser is expelled and reinitialized in the search space, while the winner remains. Parrott and Li [8] divided the swarm population into species subpopulations based on their similarity. Duplicated particles are removed when they are identified as having the same fitness as the species seed within the same species. After destroying the duplicates, new particles are added randomly until the population is restored to its initial size. Niu et al. [9] proposed the Multi-swarm Cooperative Particle Swarm Optimizer (MCPSO) based on a master-slave model, in which a population consists of one master swarm and several slave swarms. MCPSO is based on an antagonistic scenario: the master swarm enhances its particles through a series of direct competitions with the slave swarms, and the fittest particles in all the swarms get the opportunity to guide the flight direction of the particles in the master swarm.

However, the studies mentioned above have only solved traditional optimization problems, namely continuous parameter optimization. Moreover, to maintain the diversity of the swarms, they do not change the number of particles or the number of swarms, thereby ignoring the competition among the swarms. In this paper, we propose the MSPSO algorithm, a modified multi-swarm PSO that introduces the mechanism of survival of the fittest to describe the competition among the swarms. Four rules are designed according to this mechanism, in which the number of sub-swarms is allowed to decrease during the iterations; that is, some of the sub-swarms are destroyed during the iterations, and the destroyed sub-swarms cannot be reconstructed any more. Our proposed Multi-Swarm Particle Swarm Optimization (MSPSO) can solve not only continuous parameter problems but also discrete problems. To the best of our knowledge, this is the first paper to apply multi-swarm PSO to the feature selection problem.

The main innovations in this paper are described as follows:

(1) A MSPSO algorithm was proposed, which consists of a number of sub-swarms and a scheduling module. Survival of the fittest is introduced to decide whether a sub-swarm should be destroyed or reserved. To achieve that goal, 4 rules are designed. The scheduling module monitors and controls each sub-swarm according to the rules during the iterations.

(2) The F-score [10], which can calculate the score of each feature, was introduced to evaluate the results of the feature selection. The objective function is designed according to the classification accuracy rate and the feature scores.

(3) An Improved Feature Selection (IFS) method was proposed, which consists of two stages. In the first stage, both the Support Vector Machines (SVM) parameter optimization and the feature selection are dynamically executed by MSPSO. In the second stage, the SVM model performs the classification tasks using these optimal values and selected features via 10-fold cross validation.

The remainder of this paper is organized as follows. Section 2 reviews the basic principles of PSO and SVM. Section 3 describes the objective function, the multi-swarm scheduling module and the IFS approach in detail. Section 4 presents the experimental results on 10 benchmark datasets. Finally, section 5 summarizes the conclusion.

2 Basic principles

2.1 Particle swarm optimization

PSO originated from the simulation of the social behavior of birds in a flock [2,3]. In PSO, each particle flies in the search space with a velocity adjusted by its own flying memory and its companions' flying experience. Each particle has an objective function value decided by a fitness function, and its velocity is updated as:

v_{id}^{t+1} = w v_{id}^{t} + c_1 r_1^{t} (p_{id}^{t} - x_{id}^{t}) + c_2 r_2^{t} (p_{gd}^{t} - x_{id}^{t}),  (1)

where i represents the ith particle and d is the dimension of the solution space, c_1 denotes the cognition learning factor, c_2 indicates the social learning factor, r_1^t and r_2^t are random numbers uniformly distributed in (0,1), p_{id}^t and p_{gd}^t stand for the position with the best fitness found so far for the ith particle and the best position in the neighborhood, v_{id}^{t+1} and v_{id}^{t} are the velocities at time t+1 and time t, respectively, and x_{id}^t is the position of the ith particle at time t. Each particle then moves to a new potential solution based on the following equation:

x_{id}^{t+1} = x_{id}^{t} + v_{id}^{t+1},  d = 1, 2, ..., D.  (2)

Kennedy and Eberhart [11] proposed a binary PSO in which a particle moves in a state space restricted to 0 and 1 on each dimension, in terms of the changes in the probability that a bit will be in one state or the other:

x_{id} = 1 if rand() < S(v_{id}), otherwise x_{id} = 0,  (3)

S(v) = 1 / (1 + e^{-v}).  (4)

The function S(v) is a sigmoid limiting transformation and rand() is a random number drawn from a uniform distribution on [0.0, 1.0].

2.2 Support vector machines

SVM is specifically designed for two-class problems [12,13]. Given a training set of instance-label pairs (x_i, y_i), i = 1, 2, ..., m, where x_i belongs to R^n and y_i belongs to {+1, -1}, the generalized linear SVM finds an optimal separating function f(x) = (w · x) + b. The classifier is:

f(x) = sgn{ Σ_{i=1}^{n} a_i y_i (x_i · x) + b }.  (5)

For the non-linear case, SVM maps the data from a lower-dimensional space into a higher-dimensional space through the kernel trick. The classifier is:

f(x) = sgn{ Σ_{i=1}^{n} a_i y_i K(x_i, x) + b },  (6)

where sgn{} is the sign function, a_i is a Lagrange multiplier, x_i is a training sample, x is a sample to be classified, and K(x_i, x) is the kernel function. Example kernel functions include the polynomial function, the linear function, and the Radial Basis Function (RBF). In this work, we investigated the RBF kernel function.

3 IFS approach

We propose the IFS approach, which combines parameter optimization and feature selection in order to obtain a higher classification accuracy rate. A modified PSO algorithm named MSPSO is proposed, which holds a number of sub-swarms scheduled by the multi-swarm scheduling module. The multi-swarm scheduling module monitors all the sub-swarms, and gathers the results from the sub-swarms. The storage of MSPSO is shown in Fig. 1. The SVM parameters, feature values and system parameters are described in detail. We modify the PSO to solve the discrete problem according to Ref. [11]. The proposed method consists of two stages. In the first stage, both the SVM parameter optimization and the feature selection are dynamically executed by MSPSO.
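The discrete PSO described by Eqs. (1), (3) and (4) updates a real-valued velocity as usual and then resamples each bit through the sigmoid. A minimal sketch in Python; the function name and the clamp of the velocity to [-v_max, v_max] are our own illustrative choices, not taken from the paper:

```python
import math
import random

def update_particle(x, v, pbest, gbest, w=0.9, c1=2.0, c2=2.0, vmax=6.0):
    """One binary-PSO step: update the velocity per Eq. (1), clamp it,
    then resample each bit with probability S(v) per Eqs. (3)-(4)."""
    new_x, new_v = [], []
    for d in range(len(x)):
        vd = (w * v[d]
              + c1 * random.random() * (pbest[d] - x[d])
              + c2 * random.random() * (gbest[d] - x[d]))
        vd = max(-vmax, min(vmax, vd))            # keep velocity in [-vmax, vmax]
        s = 1.0 / (1.0 + math.exp(-vd))           # sigmoid limiting transformation
        new_x.append(1 if random.random() < s else 0)
        new_v.append(vd)
    return new_x, new_v
```

Because the position is a bit string, the same particle can encode a feature mask directly, with bit d meaning "feature d is selected".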
In the second stage, the SVM model performs the classification tasks using these optimal values and the selected feature subsets via 10-fold cross validation. An efficient objective function is designed according to the classification accuracy rate and the F-score. The objective function consists of two parts: one is the classification accuracy rate and the other is the feature score. They are summed into one single objective function by linear weighting. The two weights are θ_a and θ_b, and each controls the weight of its specific part.

3.1 Classification accuracy

The classification accuracy for the dataset was measured according to the following equation:

accuracy(N) = (1/|N|) Σ_{n ∈ N} assess(n), where assess(n) = 1 if classify(n) = n.c, and assess(n) = 0 otherwise,  (7)

where N is the set of data items to be classified (the test set), n ∈ N, n.c is the class of the item n, and classify(n) returns the classification of n by IFS.

3.2 F-score

F-score is a simple technique which measures the discrimination between two sets of real numbers. Given training vectors x_k, k = 1, 2, ..., m, if the numbers of positive and negative instances are n_+ and n_-, respectively, then the F-score of the ith feature is defined as follows [10]:

F(i) = [ (x̄_i^{(+)} - x̄_i)^2 + (x̄_i^{(-)} - x̄_i)^2 ] / [ (1/(n_+ - 1)) Σ_{k=1}^{n_+} (x_{k,i}^{(+)} - x̄_i^{(+)})^2 + (1/(n_- - 1)) Σ_{k=1}^{n_-} (x_{k,i}^{(-)} - x̄_i^{(-)})^2 ],  (8)

where x̄_i, x̄_i^{(+)}, x̄_i^{(-)} are the averages of the ith feature over the whole, positive, and negative datasets, respectively; x_{k,i}^{(+)} is the ith feature of the kth positive instance, and x_{k,i}^{(-)} is the ith feature of the kth negative instance. The numerator shows the discrimination between the positive and negative sets, and the denominator defines the

one within each of the two sets. The larger the F-score is, the more discriminative the feature is. However, some features have low F-scores because in Eq. (8) the denominator (the sum of the variances of the positive and negative sets) is much larger than the numerator. Xie and Wang [20] proposed an improved F-score to measure the discrimination among more than two sets. Given training vectors x_k, k = 1, 2, ..., m, and a number of datasets l (l ≥ 2), if the size of the jth dataset is n_j, j = 1, 2, ..., l, then the F-score of the ith feature is defined as:

F_i = Σ_{j=1}^{l} (x̄_i^{(j)} - x̄_i)^2 / Σ_{j=1}^{l} [ (1/(n_j - 1)) Σ_{k=1}^{n_j} (x_{k,i}^{(j)} - x̄_i^{(j)})^2 ],

where x̄_i and x̄_i^{(j)} are the averages of the ith feature over the whole dataset and over the jth dataset, respectively, and x_{k,i}^{(j)} is the ith feature of the kth instance in the jth dataset. The numerator indicates the discrimination between the datasets, and the denominator indicates the discrimination within each dataset. The larger the F-score is, the more discriminative the feature is.

In this study, we utilize the F-score to calculate the score of each attribute in order to get the weights of the features according to F(FS(i)). Eq. (9) is responsible for calculating the scores of the feature masks, where 1 represents that a feature is selected and 0 represents that it is not. If the ith feature is selected, FS(i) equals the instance of feature i; otherwise FS(i) equals 0:

FS(i) = instance i, if feature i is selected; FS(i) = 0, if feature i is not selected.  (9)

3.3 Objective function definition

We design an objective function which combines the classification accuracy rate and the F-score. The objective function is the evaluation criterion for the selected features. To get the accuracy rate, we need to train and test the dataset according to the selected features:

fitness = θ_a · accuracy + θ_b · ( Σ_{j=1}^{N_b} F(FS(j)) / Σ_{k=1}^{N_b} F(k) ).  (10)

In Eq. (10), θ_a is the weight for the SVM classification accuracy rate, accuracy is the classification accuracy rate for the selected features, θ_b is the weight for the score of the selected features, F(FS(j)) is the function for calculating the score of the current features, and Σ_{j=1}^{N_b} F(FS(j)) and Σ_{k=1}^{N_b} F(k) are the total scores of the selected features and of all features, respectively.

3.4 Multi-swarm scheduling module

MSPSO holds a number of swarms scheduled by the multi-swarm scheduling module. Each swarm controls its own iteration procedure, position updates, velocity updates, and other parameters. Each swarm selects different occasions from the current computing environment, then sends the current results to the multi-swarm scheduling module, which decides whether they affect other swarms. The scheduling module monitors all the sub-swarms, and gathers the results from the sub-swarms. Fig. 1 shows the structure of the multi-swarm scheduling module, which consists of a multi-swarm scheduler and some sub-swarms. Each sub-swarm contains a number of particles. The multi-swarm scheduler can send commands or data to the sub-swarms, and vice versa.

(1) The swarm request rule. If the current sub-swarm meets the condition of Eq. (11), it sends the results corresponding to its pbest (personal best) and gbest (global best) values to the multi-swarm scheduler. If S = 1, the current swarm sends records which contain the pbest and gbest values; otherwise the current swarm does not send the results.

S = 1, if d < rand() · Fitness · (tt - t) / tt;  S = 0, if d ≥ rand() · Fitness · (tt - t) / tt.  (11)

In Eq. (11), d represents a threshold, tt the maximal iteration number, and t the current iteration number; rand() is a random number uniformly distributed in U(0, 1).

(2) The multi-swarm scheduler request rule. The multi-swarm scheduler monitors each sub-swarm, and sends a request in order to obtain a result from the current sub-swarm when that sub-swarm is valuable. If a sub-swarm has sent the swarm request rule more than k·n times, where k = 3 and n = 1, 2, 3, ..., 100, the multi-swarm scheduler will send the rule.

Fig. 1 The structure of multi-swarm scheduling.

The multi-swarm scheduler request rule is triggered by evaluating the activity level of the current sub-swarm. The more active the sub-swarm is, the more valuable it is, since the best result may be in it.

(3) The multi-swarm collection rule. The multi-swarm scheduler collects results from the alive sub-swarms and updates pbest and gbest from the storage table.

(4) The multi-swarm destroying rule. a. If a swarm sends the swarm request rule k times and k < f according to Eq. (12), then the multi-swarm scheduler destroys that sub-swarm. b. If a swarm does not change its gbest within pn iterations, then the multi-swarm scheduler destroys that sub-swarm. We set pn in the initialization of PSO.

f = Σ_{l=1}^{n} te(l) / (m · pl).  (12)

In Eq. (12), te() is the function counting how many times a sub-swarm has sent the swarm request rule, m is a threshold, and pl is the number of alive sub-swarms.

3.5 MSPSO algorithm

Step 1: Load the dataset from the text file and convert it from stream format to object format. Store the formatted memory data in a temporary table for the initialization of PSO. Initialize the size of the swarms randomly, and assign separate memory to each swarm. Initialize all particle positions x_{ij} and velocities v_{ij} of each swarm with random values, then calculate the objective function. Update pbest (local best) and gbest (global best) of each swarm from the table. Go to Step 2.

Step 2: Specify the parameters of each swarm, including the lower and upper bounds of the velocity, the size of particles, the number of iterations, c_1 (the cognition learning factor), c_2 (the social learning factor), d (in Eq. (11)), m (in the multi-swarm destroying rule) and pn (in Eq. (12)). Set iteration number = 0, current particle number = 1, tt = the maximal iteration number, and t = the current iteration number. Go to Step 3.

Step 3: In each swarm, if the current iteration number is less than the number of iterations, or gbest has remained unchanged for fewer than 45 iterations, go to Step 4; otherwise destroy the swarm, and go to Step 10. The main scheduling module updates pbest, compares the gbest of the current swarm with the previous one in the module, and then judges whether to update gbest using the multi-swarm scheduler request rule or not. If gbest or pbest is changed, execute the multi-swarm collection rule.
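The objective of Eq. (10), which each particle evaluates after training and testing on its selected features, can be sketched together with the two-class F-score of Eq. (8). This is an illustrative sketch: the function names and toy values are our own, and `accuracy` would come from an SVM trained on the masked features.

```python
def f_score(pos, neg):
    """Two-class F-score of one feature (Eq. 8). pos/neg hold the
    feature's values over the positive/negative instances."""
    mean = lambda xs: sum(xs) / len(xs)
    m_all, m_pos, m_neg = mean(pos + neg), mean(pos), mean(neg)
    between = (m_pos - m_all) ** 2 + (m_neg - m_all) ** 2   # numerator
    within = (sum((x - m_pos) ** 2 for x in pos) / (len(pos) - 1)
              + sum((x - m_neg) ** 2 for x in neg) / (len(neg) - 1))
    return between / within

def fitness(accuracy, mask, scores, theta_a=0.8, theta_b=0.2):
    """Objective of Eq. (10): weighted sum of the classification accuracy
    and the normalised F-score mass of the selected features
    (mask[i] = 1 selects feature i, scores[i] is its F-score)."""
    selected = sum(s for m, s in zip(mask, scores) if m == 1)
    return theta_a * accuracy + theta_b * selected / sum(scores)
```

A well-separated feature (positive and negative values far apart, low spread) gets a much larger F-score than an overlapping one, so the second term rewards masks that keep discriminative features.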

Step 4: In each swarm, if the current particle number < the particle size, go to Step 5; otherwise, go to Step 9.

Step 5: In each swarm, get gbest and pbest from the table, and let each particle update its position and velocity. Go to Step 6.

Step 6: Restrict the position and velocity of each individual. Go to Step 7.

Step 7: Each particle calculates its fitness and updates pbest and gbest. Execute the swarm request rule, and go to Step 8. If the current swarm needs to be destroyed according to the multi-swarm destroying rule, dispose of the current swarm, and exit.

Step 8: current particle number = current particle number + 1. Go to Step 4.

Step 9: current iteration number = current iteration number + 1. Go to Step 3.

Step 10: Execute the multi-swarm collection rule, and exit.

3.6 Convergence and complexity analysis

Convergence analysis and stability studies have been reported by Clerc and Kennedy [14], Trelea [15], Kadirkamanathan et al. [16], and Jiang et al. [17]. These studies established conditions under which PSO converges in a limited number of iterations. In order to guarantee the convergence of the proposed method, we set the parameters of PSO as ω = 0.9, c_1 = 2, c_2 = 2 (according to Refs. [18] and [19]). The time complexity of the proposed method is O(M · N · K), where M, N, K are the number of iterations, the number of sub-swarms, and the number of particles, respectively. In the worst case, if the number of sub-swarms remains unchanged and the number of iterations reaches the maximum, the time complexity is O(M · N · K). In general, the number of sub-swarms is reduced after some iterations, and thus the time complexity is O(Σ_{i=1}^{M} L_i · K), where L_i ≤ N.

4 Experiments and results

4.1 Experimental setting

The numbers of iterations and particles are set to 400 and 150, respectively. The searching ranges for C and γ are as follows: C ∈ [2^{-5}, 2^{5}], γ ∈ [2^{-5}, 2^{5}]; [-v_max, v_max] is predefined as [-1000, 1000] for parameter C, as [-1000, 1000] for parameter γ, and as [-6, 6] for the feature mask. For the objective function, we set θ_a and θ_b to 0.8 and 0.2 according to our experience. The following datasets, taken from the UCI machine learning and StatLog databases, are used to evaluate the performance of the proposed IFS approach: Australian, German, Cleveland heart, breast cancer, heart disease, vehicle silhouettes, hill-valley, landsat satellite, sonar, and Wisconsin Diagnostic Breast Cancer (WDBC). Their descriptions are given in Table 1.

Table 1 Dataset description

No.  Dataset                            Features   Missing values
1    Australian (StatLog project)       14         Yes
2    German (StatLog project)           23         No
3    Cleveland heart                    13         Yes
4    Breast cancer (Wisconsin)          9          Yes
5    Heart disease (StatLog project)    13         No
6    Vehicle silhouettes (Vehicle)      17         No
7    Hill-valley                        100        No
8    Landsat satellite (Landsat)        36         No
9    Sonar                              60         No
10   WDBC                               30         No

The 10-fold cross validation was used to evaluate the classification accuracy, and the average error across all 10 trials was computed. The hill-valley and landsat satellite datasets have pre-defined training/test splits; except for these datasets, all of the experimental results are averaged over the 10 runs of 10-fold Cross-Validation (CV).

4.2 Results

Table 2 shows the classification accuracy rates of IFS with and without feature selection. As shown in Table 2, IFS with feature selection performs significantly better than IFS without feature selection in almost all cases examined, at the significance level of 0.05, except on the Australian dataset. The average classification accuracy rate for each dataset improved significantly after feature selection. The results also show that the classification accuracy rates of the IFS approach, both with and without feature selection, were better than those of grid search in all cases (Table 3).

Grid search is a local search method which is vulnerable to local optima. Grid search can supply locally optimal parameters to SVM, but its search region is small, and it cannot lead SVM to a higher classification accuracy rate. The empirical analysis indicates that the developed IFS approach can obtain the optimal parameter values, and find a subset of discriminative features without decreasing the SVM classification accuracy.

Table 2 Results of the proposed IFS with and without feature selection ("-" marks values lost in this transcription)

Dataset          Original features   Selected features   Rate with FS (%)   Rate without FS (%)   Paired t-test P-value
Australian       14                  -                   -                  -                     -
German           23                  -                   -                  -                     < 0.001
Cleveland heart  13                  6.1 ± -             -                  -                     < 0.001
Breast cancer    9                   -                   -                  -                     < 0.001
Heart disease    13                  -                   -                  -                     < 0.001
Vehicle          17                  7.1 ± -             -                  -                     < 0.001
Hill-valley      100                 -                   -                  -                     < 0.001
Landsat          36                  13 ± -              -                  -                     < 0.001
Sonar            60                  -                   -                  -                     < 0.001
WDBC             30                  13 ± -              -                  -                     -

Table 3 Experimental results summary of IFS with feature selection, IFS without feature selection and the grid search algorithm ("-" marks values lost in this transcription)

Dataset          (1) IFS with FS   (2) IFS without FS   (3) Grid search   t-test (1) vs (3)   t-test (2) vs (3)
Australian       -                 -                    -                 < 0.001             < 0.001
German           -                 -                    -                 < 0.001             < 0.001
Cleveland heart  -                 -                    -                 < 0.001             < 0.001
Breast cancer    -                 -                    -                 < 0.001             < 0.001
Heart disease    -                 -                    -                 < 0.001             < 0.001
Vehicle          -                 -                    -                 -                   -
Hill-valley      -                 -                    -                 < 0.001             -
Landsat          -                 -                    -                 -                   -
Sonar            -                 -                    -                 < 0.001             -
WDBC             -                 -                    -                 -                   -

The comparison between IFS and GA+SVM using feature selection is shown in Table 4. The detailed parameter settings for GA+SVM were as follows: population size = 500, crossover rate = 0.7, mutation rate = . The classification accuracy rates of IFS with feature selection were higher than those of GA+SVM for all datasets, whereas the classification accuracy rates of GA+SVM were higher than those of IFS without feature selection, as shown in Table 4. Therefore, it is important to eliminate noisy, irrelevant features in order to increase the classification accuracy rates.
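For reference, the grid-search baseline amounts to scoring each (C, γ) pair on a log2 grid by cross-validated accuracy and keeping the best pair. A self-contained skeleton; the grid bounds, fold construction and function names are our own assumptions, and the classifier itself is left as a caller-supplied scoring function:

```python
import itertools
import random

def kfold_indices(n, k=10, seed=0):
    """Split n sample indices into k folds for k-fold CV; each fold
    serves once as the test set, the rest form the training set."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    return [(sorted(set(idx) - set(f)), sorted(f)) for f in folds]

def grid_search(score, c_exps=range(-5, 6, 2), g_exps=range(-5, 6, 2)):
    """Return the (C, gamma) pair maximising a caller-supplied CV score."""
    grid = [(2.0 ** a, 2.0 ** b) for a, b in itertools.product(c_exps, g_exps)]
    return max(grid, key=lambda p: score(*p))
```

Because the grid is fixed in advance, such a search can only ever return one of its own grid points, which is the "small search region" limitation discussed above; MSPSO instead moves C and γ continuously within their ranges.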
Table 4 Comparison between the IFS and GA+SVM approaches ("-" marks values lost in this transcription)

Dataset          Original features   IFS selected features   IFS rate (%)   GA+SVM selected features   GA+SVM rate (%)
Australian       14                  8.4 ± -                 -              7.9 ± -                    -
German           23                  -                       -              -                          -
Cleveland heart  13                  -                       -              -                          -
Breast cancer    9                   -                       -              -                          -
Heart disease    13                  -                       -              -                          -
Vehicle          17                  -                       -              -                          -
Hill-valley      100                 -                       -              -                          -
Landsat          36                  -                       -              -                          -
Sonar            60                  -                       -              -                          -
WDBC             30                  - ± 1.33                -              - ± 0.99                   -

Figs. 2a and 2b show the global best classification accuracies over the iterations on the Australian and German datasets using IFS, PSO+SVM and GA+SVM, respectively. Figs. 2e and 2f show the local best classification accuracies over the iterations on the same two datasets using the same three methods. The convergence speeds of PSO+SVM and GA+SVM were faster than that of IFS, whereas their resulting classification accuracies were lower than those of IFS. Moreover, PSO+SVM and GA+SVM prematurely converged to local optima, which indicates that IFS has greater exploration capability. The numbers of selected features over the evolution on the German and Australian datasets using the three methods are shown in Fig. 3 and Fig. 4, respectively. Figs. 2c and 2d show the number

of sub-swarms over the iterations on the Australian and German datasets using IFS. With different numbers of initial sub-swarms, a great number of sub-swarms were eliminated, and only a small number of sub-swarms remained at the final iteration. Most of the weak sub-swarms are eliminated during the evolution; thus, excellent sub-swarms are preserved after the competition, which enhances the exploration ability of the whole swarm so that it obtains more important features.

The comparison between IFS and PSO+SVM using feature selection, in terms of the number of selected features and the average classification accuracy rates, is shown in Table 5. For comparison purposes, we implemented the PSO+SVM approach using the standard PSO algorithm, with the following parameter settings: the iteration size was set to 500, and the number of particles to 100. The classification accuracy rate was adopted as the objective function. The analytical results reveal that IFS with feature selection performs significantly better than the standard PSO with feature selection on all datasets in terms of the classification accuracy rates.

Fig. 2 Prediction accuracies and number of sub-swarms over the iterations. (a) Global best accuracies on the Australian dataset using IFS, PSO+SVM and GA+SVM. (b) Global best accuracies on the German dataset using IFS, PSO+SVM and GA+SVM. (c) Number of sub-swarms on the Australian dataset using IFS, one curve per number of initial sub-swarms. (d) Number of sub-swarms on the German dataset using IFS, one curve per number of initial sub-swarms. (e) Local best accuracies on the Australian dataset using IFS, PSO+SVM and GA+SVM. (f) Local best accuracies on the German dataset using IFS, PSO+SVM and GA+SVM.

Fig. 3 Number of selected features over the iterations on the Australian dataset using IFS, PSO+SVM and GA+SVM.

Fig. 4 Number of selected features over the iterations on the German dataset using IFS, PSO+SVM and GA+SVM.
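The elimination of weak sub-swarms observed above is driven by rules (1) and (4) of the scheduling module. A sketch of Eqs. (11) and (12); the function names are ours, and reading Eq. (12)'s sum as running over the alive sub-swarms is our interpretation of the text:

```python
import random

def swarm_request(d, fitness_value, t, tt):
    """Swarm request rule, Eq. (11): a sub-swarm reports its pbest/gbest
    when the threshold d falls below rand() * Fitness * (tt - t) / tt,
    so reporting becomes rarer as iteration t approaches the maximum tt."""
    return 1 if d < random.random() * fitness_value * (tt - t) / tt else 0

def swarms_to_destroy(request_counts, m):
    """Destroying rule (a) with Eq. (12): f averages the request activity
    over the pl alive sub-swarms, scaled by the threshold m; sub-swarms
    whose own request count k stays below f are destroyed."""
    pl = len(request_counts)
    f = sum(request_counts) / (m * pl)
    return [i for i, k in enumerate(request_counts) if k < f]
```

Under this reading, a sub-swarm that reports less often than the (scaled) average activity of its peers is judged inactive and removed, which matches the behavior seen in Figs. 2c and 2d.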

Table 5 Comparison between IFS and the standard PSO ("-" marks values lost in this transcription)

Dataset          Original features   IFS selected features   IFS rate (%)   PSO+SVM selected features   PSO+SVM rate (%)
Australian       14                  -                       -              -                           -
German           23                  -                       -              -                           -
Cleveland heart  13                  6.1 ± -                 -              -                           -
Breast cancer    9                   -                       -              -                           -
Heart disease    13                  -                       -              -                           -
Vehicle          17                  7.1 ± -                 -              -                           -
Hill-valley      100                 -                       -              -                           -
Landsat          36                  13 ± -                  -              -                           -
Sonar            60                  -                       -              -                           -
WDBC             30                  13 ± -                  -              -                           -

5 Conclusion

In this study, a novel multi-swarm MSPSO algorithm is proposed to solve discrete problems, with an efficient objective function designed by taking into consideration the classification accuracy rate and the F-score. In order to describe the competition among the swarms, we introduced the mechanism of survival of the fittest. To further settle the feature selection problem, we put forward the IFS approach, in which both the SVM parameter optimization and the feature selection are dynamically executed by the MSPSO algorithm; then, the SVM model performs the classification tasks using the optimal parameter values and the selected subset of features. The evaluation on the 10 benchmark problems, by comparison with the standard PSO based, genetic algorithm based, and grid search based methods, indicates that the proposed approach performs significantly better than the others in terms of the classification accuracy rates.

Acknowledgments

This work was supported by the National Natural Science Foundation of China, the National Electronic Development Foundation of China, and a Jilin Province Science and Technology Department Project of China.

References

[1] Guyon I, Elisseeff A. An introduction to variable and feature selection. Journal of Machine Learning Research, 2003, 3.
[2] Kennedy J, Eberhart R. Particle swarm optimization. Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, 1995.
[3] Eberhart R, Kennedy J. A new optimizer using particle swarm theory. Proceedings of the Sixth International Symposium on Micro Machine and Human Science, Nagoya, Japan, 1995.
[4] Lin S W, Ying K C, Chen S C, Lee Z J. Particle swarm optimization for parameter determination and feature selection of support vector machines. Expert Systems with Applications, 2008, 35.
[5] Huang C L, Dun J F. A distributed PSO-SVM hybrid system with feature selection and parameter optimization. Applied Soft Computing, 2008, 8.
[6] Blackwell T. Particle swarms and population diversity. Soft Computing, 2005, 9.
[7] Blackwell T, Branke J. Multiswarms, exclusion, and anti-convergence in dynamic environments. IEEE Transactions on Evolutionary Computation, 2006, 10.
[8] Parrott D, Li X D. Locating and tracking multiple dynamic optima by a particle swarm model using speciation. IEEE Transactions on Evolutionary Computation, 2006, 10.
[9] Niu B, Zhu Y L, He X X, Wu H. MCPSO: A multi-swarm cooperative particle swarm optimizer. Applied Mathematics and Computation, 2007, 185.
[10] Chen Y W, Lin C J. Combination of feature selection approaches with SVM in credit scoring. Expert Systems with Applications, 2006, 37.
[11] Kennedy J, Eberhart R. A discrete binary version of the particle swarm algorithm. Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, Orlando, USA, 1997.
[12] Vapnik V N. The Nature of Statistical Learning Theory, 2nd ed, Springer, New York, 1999.
[13] Boser B E, Guyon I M, Vapnik V N. A training algorithm for optimal margin classifiers. Proceedings of the Fifth Annual Workshop on Computational Learning Theory, Pittsburgh, USA, 1992.
[14] Clerc M, Kennedy J. The particle swarm: explosion, stability, and convergence in a multidimensional complex space. IEEE Transactions on Evolutionary Computation, 2002, 6.
[15] Trelea I C. The particle swarm optimization algorithm:

convergence analysis and parameter selection. Information Processing Letters, 2003, 85.
[16] Kadirkamanathan V, Selvarajah K, Fleming P J. Stability analysis of the particle dynamics in particle swarm optimizer. IEEE Transactions on Evolutionary Computation, 2006, 10.
[17] Jiang M, Luo Y P, Yang S Y. Stochastic convergence analysis and parameter selection of the standard particle swarm optimization algorithm. Information Processing Letters, 2007, 102.
[18] Shi Y, Eberhart R. A modified particle swarm optimizer. Proceedings of the IEEE International Conference on Evolutionary Computation, Anchorage, USA, 1998.
[19] Zhan Z H, Zhang J, Li Y. Adaptive particle swarm optimization. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 2009, 39.
[20] Xie J Y, Wang C X. Using support vector machines with a novel hybrid feature selection method for diagnosis of erythemato-squamous diseases. Expert Systems with Applications, 2011, 38.


More information

Classifier Selection Based on Data Complexity Measures *

Classifier Selection Based on Data Complexity Measures * Classfer Selecton Based on Data Complexty Measures * Edth Hernández-Reyes, J.A. Carrasco-Ochoa, and J.Fco. Martínez-Trndad Natonal Insttute for Astrophyscs, Optcs and Electroncs, Lus Enrque Erro No.1 Sta.

More information

Recommended Items Rating Prediction based on RBF Neural Network Optimized by PSO Algorithm

Recommended Items Rating Prediction based on RBF Neural Network Optimized by PSO Algorithm Recommended Items Ratng Predcton based on RBF Neural Network Optmzed by PSO Algorthm Chengfang Tan, Cayn Wang, Yuln L and Xx Q Abstract In order to mtgate the data sparsty and cold-start problems of recommendaton

More information

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints Australan Journal of Basc and Appled Scences, 2(4): 1204-1208, 2008 ISSN 1991-8178 Sum of Lnear and Fractonal Multobjectve Programmng Problem under Fuzzy Rules Constrants 1 2 Sanjay Jan and Kalash Lachhwan

More information

Meta-heuristics for Multidimensional Knapsack Problems

Meta-heuristics for Multidimensional Knapsack Problems 2012 4th Internatonal Conference on Computer Research and Development IPCSIT vol.39 (2012) (2012) IACSIT Press, Sngapore Meta-heurstcs for Multdmensonal Knapsack Problems Zhbao Man + Computer Scence Department,

More information

Complexity Analysis of Problem-Dimension Using PSO

Complexity Analysis of Problem-Dimension Using PSO Proceedngs of the 7th WSEAS Internatonal Conference on Evolutonary Computng, Cavtat, Croata, June -4, 6 (pp45-5) Complexty Analyss of Problem-Dmenson Usng PSO BUTHAINAH S. AL-KAZEMI AND SAMI J. HABIB,

More information

A Notable Swarm Approach to Evolve Neural Network for Classification in Data Mining

A Notable Swarm Approach to Evolve Neural Network for Classification in Data Mining A Notable Swarm Approach to Evolve Neural Network for Classfcaton n Data Mnng Satchdananda Dehur 1, Bjan Bhar Mshra 2 and Sung-Bae Cho 1 1 Soft Computng Laboratory, Department of Computer Scence, Yonse

More information

Classification / Regression Support Vector Machines

Classification / Regression Support Vector Machines Classfcaton / Regresson Support Vector Machnes Jeff Howbert Introducton to Machne Learnng Wnter 04 Topcs SVM classfers for lnearly separable classes SVM classfers for non-lnearly separable classes SVM

More information

CHAPTER 2 PROPOSED IMPROVED PARTICLE SWARM OPTIMIZATION

CHAPTER 2 PROPOSED IMPROVED PARTICLE SWARM OPTIMIZATION 24 CHAPTER 2 PROPOSED IMPROVED PARTICLE SWARM OPTIMIZATION The present chapter proposes an IPSO approach for multprocessor task schedulng problem wth two classfcatons, namely, statc ndependent tasks and

More information

Classifier Swarms for Human Detection in Infrared Imagery

Classifier Swarms for Human Detection in Infrared Imagery Classfer Swarms for Human Detecton n Infrared Imagery Yur Owechko, Swarup Medasan, and Narayan Srnvasa HRL Laboratores, LLC 3011 Malbu Canyon Road, Malbu, CA 90265 {owechko, smedasan, nsrnvasa}@hrl.com

More information

NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS

NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS ARPN Journal of Engneerng and Appled Scences 006-017 Asan Research Publshng Network (ARPN). All rghts reserved. NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS Igor Grgoryev, Svetlana

More information

Analysis of Particle Swarm Optimization and Genetic Algorithm based on Task Scheduling in Cloud Computing Environment

Analysis of Particle Swarm Optimization and Genetic Algorithm based on Task Scheduling in Cloud Computing Environment Analyss of Partcle Swarm Optmzaton and Genetc Algorthm based on Tas Schedulng n Cloud Computng Envronment Frederc Nzanywayngoma School of Computer and Communcaton Engneerng Unversty of Scence and Technology

More information

An Iterative Solution Approach to Process Plant Layout using Mixed Integer Optimisation

An Iterative Solution Approach to Process Plant Layout using Mixed Integer Optimisation 17 th European Symposum on Computer Aded Process Engneerng ESCAPE17 V. Plesu and P.S. Agach (Edtors) 2007 Elsever B.V. All rghts reserved. 1 An Iteratve Soluton Approach to Process Plant Layout usng Mxed

More information

Journal of Chemical and Pharmaceutical Research, 2014, 6(6): Research Article. A selective ensemble classification method on microarray data

Journal of Chemical and Pharmaceutical Research, 2014, 6(6): Research Article. A selective ensemble classification method on microarray data Avalable onlne www.ocpr.com Journal of Chemcal and Pharmaceutcal Research, 2014, 6(6):2860-2866 Research Artcle ISSN : 0975-7384 CODEN(USA) : JCPRC5 A selectve ensemble classfcaton method on mcroarray

More information

Maximum Variance Combined with Adaptive Genetic Algorithm for Infrared Image Segmentation

Maximum Variance Combined with Adaptive Genetic Algorithm for Infrared Image Segmentation Internatonal Conference on Logstcs Engneerng, Management and Computer Scence (LEMCS 5) Maxmum Varance Combned wth Adaptve Genetc Algorthm for Infrared Image Segmentaton Huxuan Fu College of Automaton Harbn

More information

A Clustering Algorithm Solution to the Collaborative Filtering

A Clustering Algorithm Solution to the Collaborative Filtering Internatonal Journal of Scence Vol.4 No.8 017 ISSN: 1813-4890 A Clusterng Algorthm Soluton to the Collaboratve Flterng Yongl Yang 1, a, Fe Xue, b, Yongquan Ca 1, c Zhenhu Nng 1, d,* Hafeng Lu 3, e 1 Faculty

More information

Support Vector Machines

Support Vector Machines Support Vector Machnes Decson surface s a hyperplane (lne n 2D) n feature space (smlar to the Perceptron) Arguably, the most mportant recent dscovery n machne learnng In a nutshell: map the data to a predetermned

More information

Outline. Discriminative classifiers for image recognition. Where in the World? A nearest neighbor recognition example 4/14/2011. CS 376 Lecture 22 1

Outline. Discriminative classifiers for image recognition. Where in the World? A nearest neighbor recognition example 4/14/2011. CS 376 Lecture 22 1 4/14/011 Outlne Dscrmnatve classfers for mage recognton Wednesday, Aprl 13 Krsten Grauman UT-Austn Last tme: wndow-based generc obect detecton basc ppelne face detecton wth boostng as case study Today:

More information

Term Weighting Classification System Using the Chi-square Statistic for the Classification Subtask at NTCIR-6 Patent Retrieval Task

Term Weighting Classification System Using the Chi-square Statistic for the Classification Subtask at NTCIR-6 Patent Retrieval Task Proceedngs of NTCIR-6 Workshop Meetng, May 15-18, 2007, Tokyo, Japan Term Weghtng Classfcaton System Usng the Ch-square Statstc for the Classfcaton Subtask at NTCIR-6 Patent Retreval Task Kotaro Hashmoto

More information

Content Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers

Content Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers IOSR Journal of Electroncs and Communcaton Engneerng (IOSR-JECE) e-issn: 78-834,p- ISSN: 78-8735.Volume 9, Issue, Ver. IV (Mar - Apr. 04), PP 0-07 Content Based Image Retreval Usng -D Dscrete Wavelet wth

More information

An Evolvable Clustering Based Algorithm to Learn Distance Function for Supervised Environment

An Evolvable Clustering Based Algorithm to Learn Distance Function for Supervised Environment IJCSI Internatonal Journal of Computer Scence Issues, Vol. 7, Issue 5, September 2010 ISSN (Onlne): 1694-0814 www.ijcsi.org 374 An Evolvable Clusterng Based Algorthm to Learn Dstance Functon for Supervsed

More information

Optimizing SVR using Local Best PSO for Software Effort Estimation

Optimizing SVR using Local Best PSO for Software Effort Estimation Journal of Informaton Technology and Computer Scence Volume 1, Number 1, 2016, pp. 28 37 Journal Homepage: www.jtecs.ub.ac.d Optmzng SVR usng Local Best PSO for Software Effort Estmaton Dnda Novtasar 1,

More information

Smoothing Spline ANOVA for variable screening

Smoothing Spline ANOVA for variable screening Smoothng Splne ANOVA for varable screenng a useful tool for metamodels tranng and mult-objectve optmzaton L. Rcco, E. Rgon, A. Turco Outlne RSM Introducton Possble couplng Test case MOO MOO wth Game Theory

More information

Data Mining For Multi-Criteria Energy Predictions

Data Mining For Multi-Criteria Energy Predictions Data Mnng For Mult-Crtera Energy Predctons Kashf Gll and Denns Moon Abstract We present a data mnng technque for mult-crtera predctons of wnd energy. A mult-crtera (MC) evolutonary computng method has

More information

An Optimal Algorithm for Prufer Codes *

An Optimal Algorithm for Prufer Codes * J. Software Engneerng & Applcatons, 2009, 2: 111-115 do:10.4236/jsea.2009.22016 Publshed Onlne July 2009 (www.scrp.org/journal/jsea) An Optmal Algorthm for Prufer Codes * Xaodong Wang 1, 2, Le Wang 3,

More information

THE PATH PLANNING ALGORITHM AND SIMULATION FOR MOBILE ROBOT

THE PATH PLANNING ALGORITHM AND SIMULATION FOR MOBILE ROBOT Journal of Theoretcal and Appled Informaton Technology 30 th Aprl 013. Vol. 50 No.3 005-013 JATIT & LLS. All rghts reserved. ISSN: 199-8645 www.jatt.org E-ISSN: 1817-3195 THE PATH PLANNING ALGORITHM AND

More information

Comparison of Heuristics for Scheduling Independent Tasks on Heterogeneous Distributed Environments

Comparison of Heuristics for Scheduling Independent Tasks on Heterogeneous Distributed Environments Comparson of Heurstcs for Schedulng Independent Tasks on Heterogeneous Dstrbuted Envronments Hesam Izakan¹, Ath Abraham², Senor Member, IEEE, Václav Snášel³ ¹ Islamc Azad Unversty, Ramsar Branch, Ramsar,

More information

A fast algorithm for color image segmentation

A fast algorithm for color image segmentation Unersty of Wollongong Research Onlne Faculty of Informatcs - Papers (Arche) Faculty of Engneerng and Informaton Scences 006 A fast algorthm for color mage segmentaton L. Dong Unersty of Wollongong, lju@uow.edu.au

More information

CHAPTER 4 OPTIMIZATION TECHNIQUES

CHAPTER 4 OPTIMIZATION TECHNIQUES 48 CHAPTER 4 OPTIMIZATION TECHNIQUES 4.1 INTRODUCTION Unfortunately no sngle optmzaton algorthm exsts that can be appled effcently to all types of problems. The method chosen for any partcular case wll

More information

A Binarization Algorithm specialized on Document Images and Photos

A Binarization Algorithm specialized on Document Images and Photos A Bnarzaton Algorthm specalzed on Document mages and Photos Ergna Kavalleratou Dept. of nformaton and Communcaton Systems Engneerng Unversty of the Aegean kavalleratou@aegean.gr Abstract n ths paper, a

More information

Optimal Design of Nonlinear Fuzzy Model by Means of Independent Fuzzy Scatter Partition

Optimal Design of Nonlinear Fuzzy Model by Means of Independent Fuzzy Scatter Partition Optmal Desgn of onlnear Fuzzy Model by Means of Independent Fuzzy Scatter Partton Keon-Jun Park, Hyung-Kl Kang and Yong-Kab Km *, Department of Informaton and Communcaton Engneerng, Wonkwang Unversty,

More information

A Fast Visual Tracking Algorithm Based on Circle Pixels Matching

A Fast Visual Tracking Algorithm Based on Circle Pixels Matching A Fast Vsual Trackng Algorthm Based on Crcle Pxels Matchng Zhqang Hou hou_zhq@sohu.com Chongzhao Han czhan@mal.xjtu.edu.cn Ln Zheng Abstract: A fast vsual trackng algorthm based on crcle pxels matchng

More information

Natural Computing. Lecture 13: Particle swarm optimisation INFR /11/2010

Natural Computing. Lecture 13: Particle swarm optimisation INFR /11/2010 Natural Computng Lecture 13: Partcle swarm optmsaton Mchael Herrmann mherrman@nf.ed.ac.uk phone: 0131 6 517177 Informatcs Forum 1.42 INFR09038 5/11/2010 Swarm ntellgence Collectve ntellgence: A super-organsm

More information

Using Particle Swarm Optimization for Enhancing the Hierarchical Cell Relay Routing Protocol

Using Particle Swarm Optimization for Enhancing the Hierarchical Cell Relay Routing Protocol 2012 Thrd Internatonal Conference on Networkng and Computng Usng Partcle Swarm Optmzaton for Enhancng the Herarchcal Cell Relay Routng Protocol Hung-Y Ch Department of Electrcal Engneerng Natonal Sun Yat-Sen

More information

Study of Data Stream Clustering Based on Bio-inspired Model

Study of Data Stream Clustering Based on Bio-inspired Model , pp.412-418 http://dx.do.org/10.14257/astl.2014.53.86 Study of Data Stream lusterng Based on Bo-nspred Model Yngme L, Mn L, Jngbo Shao, Gaoyang Wang ollege of omputer Scence and Informaton Engneerng,

More information

Edge Detection in Noisy Images Using the Support Vector Machines

Edge Detection in Noisy Images Using the Support Vector Machines Edge Detecton n Nosy Images Usng the Support Vector Machnes Hlaro Gómez-Moreno, Saturnno Maldonado-Bascón, Francsco López-Ferreras Sgnal Theory and Communcatons Department. Unversty of Alcalá Crta. Madrd-Barcelona

More information

Performance Evaluation of Information Retrieval Systems

Performance Evaluation of Information Retrieval Systems Why System Evaluaton? Performance Evaluaton of Informaton Retreval Systems Many sldes n ths secton are adapted from Prof. Joydeep Ghosh (UT ECE) who n turn adapted them from Prof. Dk Lee (Unv. of Scence

More information

Available online at Available online at Advanced in Control Engineering and Information Science

Available online at   Available online at   Advanced in Control Engineering and Information Science Avalable onlne at wwwscencedrectcom Avalable onlne at wwwscencedrectcom Proceda Proceda Engneerng Engneerng 00 (2011) 15000 000 (2011) 1642 1646 Proceda Engneerng wwwelsevercom/locate/proceda Advanced

More information

FINDING IMPORTANT NODES IN SOCIAL NETWORKS BASED ON MODIFIED PAGERANK

FINDING IMPORTANT NODES IN SOCIAL NETWORKS BASED ON MODIFIED PAGERANK FINDING IMPORTANT NODES IN SOCIAL NETWORKS BASED ON MODIFIED PAGERANK L-qng Qu, Yong-quan Lang 2, Jng-Chen 3, 2 College of Informaton Scence and Technology, Shandong Unversty of Scence and Technology,

More information

Training ANFIS Structure with Modified PSO Algorithm

Training ANFIS Structure with Modified PSO Algorithm Proceedngs of the 5th Medterranean Conference on Control & Automaton, July 7-9, 007, Athens - Greece T4-003 Tranng ANFIS Structure wth Modfed PSO Algorthm V.Seyd Ghomsheh *, M. Alyar Shoorehdel **, M.

More information

A mathematical programming approach to the analysis, design and scheduling of offshore oilfields

A mathematical programming approach to the analysis, design and scheduling of offshore oilfields 17 th European Symposum on Computer Aded Process Engneerng ESCAPE17 V. Plesu and P.S. Agach (Edtors) 2007 Elsever B.V. All rghts reserved. 1 A mathematcal programmng approach to the analyss, desgn and

More information

A Modified Median Filter for the Removal of Impulse Noise Based on the Support Vector Machines

A Modified Median Filter for the Removal of Impulse Noise Based on the Support Vector Machines A Modfed Medan Flter for the Removal of Impulse Nose Based on the Support Vector Machnes H. GOMEZ-MORENO, S. MALDONADO-BASCON, F. LOPEZ-FERRERAS, M. UTRILLA- MANSO AND P. GIL-JIMENEZ Departamento de Teoría

More information

Image Feature Selection Based on Ant Colony Optimization

Image Feature Selection Based on Ant Colony Optimization Image Feature Selecton Based on Ant Colony Optmzaton Lng Chen,2, Bolun Chen, Yxn Chen 3, Department of Computer Scence, Yangzhou Unversty,Yangzhou, Chna 2 State Key Lab of Novel Software Tech, Nanng Unversty,

More information

ARTICLE IN PRESS. Applied Soft Computing xxx (2012) xxx xxx. Contents lists available at SciVerse ScienceDirect. Applied Soft Computing

ARTICLE IN PRESS. Applied Soft Computing xxx (2012) xxx xxx. Contents lists available at SciVerse ScienceDirect. Applied Soft Computing ASOC-11; o. of Pages 1 Appled Soft Computng xxx (1) xxx xxx Contents lsts avalable at ScVerse ScenceDrect Appled Soft Computng j ourna l ho mepage: www.elsever.com/locate/asoc A herarchcal partcle swarm

More information

An Adaptive Multi-population Artificial Bee Colony Algorithm for Dynamic Optimisation Problems

An Adaptive Multi-population Artificial Bee Colony Algorithm for Dynamic Optimisation Problems *Revsed Manuscrpt (changes marked) Clck here to vew lnked References An Adaptve Mult-populaton Artfcal Bee Colony Algorthm for Dynamc Optmsaton Problems Shams K. Nseef 1, Salwan Abdullah 1, Ayad Turky

More information

Positive Semi-definite Programming Localization in Wireless Sensor Networks

Positive Semi-definite Programming Localization in Wireless Sensor Networks Postve Sem-defnte Programmng Localzaton n Wreless Sensor etworks Shengdong Xe 1,, Jn Wang, Aqun Hu 1, Yunl Gu, Jang Xu, 1 School of Informaton Scence and Engneerng, Southeast Unversty, 10096, anjng Computer

More information

Machine Learning. Support Vector Machines. (contains material adapted from talks by Constantin F. Aliferis & Ioannis Tsamardinos, and Martin Law)

Machine Learning. Support Vector Machines. (contains material adapted from talks by Constantin F. Aliferis & Ioannis Tsamardinos, and Martin Law) Machne Learnng Support Vector Machnes (contans materal adapted from talks by Constantn F. Alfers & Ioanns Tsamardnos, and Martn Law) Bryan Pardo, Machne Learnng: EECS 349 Fall 2014 Support Vector Machnes

More information

Chinese Word Segmentation based on the Improved Particle Swarm Optimization Neural Networks

Chinese Word Segmentation based on the Improved Particle Swarm Optimization Neural Networks Chnese Word Segmentaton based on the Improved Partcle Swarm Optmzaton Neural Networks Ja He Computatonal Intellgence Laboratory School of Computer Scence and Engneerng, UESTC Chengdu, Chna Department of

More information

CS246: Mining Massive Datasets Jure Leskovec, Stanford University

CS246: Mining Massive Datasets Jure Leskovec, Stanford University CS46: Mnng Massve Datasets Jure Leskovec, Stanford Unversty http://cs46.stanford.edu /19/013 Jure Leskovec, Stanford CS46: Mnng Massve Datasets, http://cs46.stanford.edu Perceptron: y = sgn( x Ho to fnd

More information

Parameters Optimization of SVM Based on Improved FOA and Its Application in Fault Diagnosis

Parameters Optimization of SVM Based on Improved FOA and Its Application in Fault Diagnosis Parameters Optmzaton of SVM Based on Improved FOA and Its Applcaton n Fault Dagnoss Qantu Zhang1*, Lqng Fang1, Sca Su, Yan Lv1 1 Frst Department, Mechancal Engneerng College, Shjazhuang, Hebe Provnce,

More information

Spam Filtering Based on Support Vector Machines with Taguchi Method for Parameter Selection

Spam Filtering Based on Support Vector Machines with Taguchi Method for Parameter Selection E-mal Spam Flterng Based on Support Vector Machnes wth Taguch Method for Parameter Selecton We-Chh Hsu, Tsan-Yng Yu E-mal Spam Flterng Based on Support Vector Machnes wth Taguch Method for Parameter Selecton

More information

Cluster Analysis of Electrical Behavior

Cluster Analysis of Electrical Behavior Journal of Computer and Communcatons, 205, 3, 88-93 Publshed Onlne May 205 n ScRes. http://www.scrp.org/ournal/cc http://dx.do.org/0.4236/cc.205.350 Cluster Analyss of Electrcal Behavor Ln Lu Ln Lu, School

More information

Classifying Acoustic Transient Signals Using Artificial Intelligence

Classifying Acoustic Transient Signals Using Artificial Intelligence Classfyng Acoustc Transent Sgnals Usng Artfcal Intellgence Steve Sutton, Unversty of North Carolna At Wlmngton (suttons@charter.net) Greg Huff, Unversty of North Carolna At Wlmngton (jgh7476@uncwl.edu)

More information

EVALUATION OF THE PERFORMANCES OF ARTIFICIAL BEE COLONY AND INVASIVE WEED OPTIMIZATION ALGORITHMS ON THE MODIFIED BENCHMARK FUNCTIONS

EVALUATION OF THE PERFORMANCES OF ARTIFICIAL BEE COLONY AND INVASIVE WEED OPTIMIZATION ALGORITHMS ON THE MODIFIED BENCHMARK FUNCTIONS Academc Research Internatonal ISS-L: 3-9553, ISS: 3-9944 Vol., o. 3, May 0 EVALUATIO OF THE PERFORMACES OF ARTIFICIAL BEE COLOY AD IVASIVE WEED OPTIMIZATIO ALGORITHMS O THE MODIFIED BECHMARK FUCTIOS Dlay

More information

An Influence of the Noise on the Imaging Algorithm in the Electrical Impedance Tomography *

An Influence of the Noise on the Imaging Algorithm in the Electrical Impedance Tomography * Open Journal of Bophyscs, 3, 3, 7- http://dx.do.org/.436/ojbphy.3.347 Publshed Onlne October 3 (http://www.scrp.org/journal/ojbphy) An Influence of the Nose on the Imagng Algorthm n the Electrcal Impedance

More information

THE CONDENSED FUZZY K-NEAREST NEIGHBOR RULE BASED ON SAMPLE FUZZY ENTROPY

THE CONDENSED FUZZY K-NEAREST NEIGHBOR RULE BASED ON SAMPLE FUZZY ENTROPY Proceedngs of the 20 Internatonal Conference on Machne Learnng and Cybernetcs, Guln, 0-3 July, 20 THE CONDENSED FUZZY K-NEAREST NEIGHBOR RULE BASED ON SAMPLE FUZZY ENTROPY JUN-HAI ZHAI, NA LI, MENG-YAO

More information

PARETO BAYESIAN OPTIMIZATION ALGORITHM FOR THE MULTIOBJECTIVE 0/1 KNAPSACK PROBLEM

PARETO BAYESIAN OPTIMIZATION ALGORITHM FOR THE MULTIOBJECTIVE 0/1 KNAPSACK PROBLEM PARETO BAYESIAN OPTIMIZATION ALGORITHM FOR THE MULTIOBJECTIVE 0/ KNAPSACK PROBLEM Josef Schwarz Jří Očenáše Brno Unversty of Technology Faculty of Engneerng and Computer Scence Department of Computer Scence

More information

Efficient Text Classification by Weighted Proximal SVM *

Efficient Text Classification by Weighted Proximal SVM * Effcent ext Classfcaton by Weghted Proxmal SVM * Dong Zhuang 1, Benyu Zhang, Qang Yang 3, Jun Yan 4, Zheng Chen, Yng Chen 1 1 Computer Scence and Engneerng, Bejng Insttute of echnology, Bejng 100081, Chna

More information

Support Vector Machines. CS534 - Machine Learning

Support Vector Machines. CS534 - Machine Learning Support Vector Machnes CS534 - Machne Learnng Perceptron Revsted: Lnear Separators Bnar classfcaton can be veed as the task of separatng classes n feature space: b > 0 b 0 b < 0 f() sgn( b) Lnear Separators

More information

A Time-driven Data Placement Strategy for a Scientific Workflow Combining Edge Computing and Cloud Computing

A Time-driven Data Placement Strategy for a Scientific Workflow Combining Edge Computing and Cloud Computing > REPLACE THIS LINE WITH YOUR PAPER IDENTIFICATION NUMBER (DOUBLE-CLICK HERE TO EDIT) < 1 A Tme-drven Data Placement Strategy for a Scentfc Workflow Combnng Edge Computng and Cloud Computng Bng Ln, Fangnng

More information

Tsinghua University at TAC 2009: Summarizing Multi-documents by Information Distance

Tsinghua University at TAC 2009: Summarizing Multi-documents by Information Distance Tsnghua Unversty at TAC 2009: Summarzng Mult-documents by Informaton Dstance Chong Long, Mnle Huang, Xaoyan Zhu State Key Laboratory of Intellgent Technology and Systems, Tsnghua Natonal Laboratory for

More information

An Indian Journal FULL PAPER ABSTRACT KEYWORDS. Trade Science Inc.

An Indian Journal FULL PAPER ABSTRACT KEYWORDS. Trade Science Inc. [Type text] [Type text] [Type text] ISSN : 97-735 Volume Issue 9 BoTechnology An Indan Journal FULL PAPER BTAIJ, (9), [333-3] Matlab mult-dmensonal model-based - 3 Chnese football assocaton super league

More information

BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION

BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION SHI-LIANG SUN, HONG-LEI SHI Department of Computer Scence and Technology, East Chna Normal Unversty 500 Dongchuan Road, Shangha 200241, P. R. Chna E-MAIL: slsun@cs.ecnu.edu.cn,

More information

A New Approach For the Ranking of Fuzzy Sets With Different Heights

A New Approach For the Ranking of Fuzzy Sets With Different Heights New pproach For the ankng of Fuzzy Sets Wth Dfferent Heghts Pushpnder Sngh School of Mathematcs Computer pplcatons Thapar Unversty, Patala-7 00 Inda pushpndersnl@gmalcom STCT ankng of fuzzy sets plays

More information

A Novel Deluge Swarm Algorithm for Optimization Problems

A Novel Deluge Swarm Algorithm for Optimization Problems A Novel eluge Swarm Algorthm for Optmzaton Problems Anahta Samad,* - Mohammad Reza Meybod Scence and Research Branch, Islamc Azad Unversty, Qazvn, Iran Soft Computng Laboratory, Computer Engneerng and

More information

K-means Optimization Clustering Algorithm Based on Hybrid PSO/GA Optimization and CS validity index

K-means Optimization Clustering Algorithm Based on Hybrid PSO/GA Optimization and CS validity index Orgnal Artcle Prnt ISSN: 3-6379 Onlne ISSN: 3-595X DOI: 0.7354/jss/07/33 K-means Optmzaton Clusterng Algorthm Based on Hybrd PSO/GA Optmzaton and CS valdty ndex K Jahanbn *, F Rahmanan, H Rezae 3, Y Farhang

More information

CHAPTER 3 SEQUENTIAL MINIMAL OPTIMIZATION TRAINED SUPPORT VECTOR CLASSIFIER FOR CANCER PREDICTION

CHAPTER 3 SEQUENTIAL MINIMAL OPTIMIZATION TRAINED SUPPORT VECTOR CLASSIFIER FOR CANCER PREDICTION 48 CHAPTER 3 SEQUENTIAL MINIMAL OPTIMIZATION TRAINED SUPPORT VECTOR CLASSIFIER FOR CANCER PREDICTION 3.1 INTRODUCTION The raw mcroarray data s bascally an mage wth dfferent colors ndcatng hybrdzaton (Xue

More information

Multi-objective Design Optimization of MCM Placement

Multi-objective Design Optimization of MCM Placement Proceedngs of the 5th WSEAS Int. Conf. on Instrumentaton, Measurement, Crcuts and Systems, Hangzhou, Chna, Aprl 6-8, 26 (pp56-6) Mult-objectve Desgn Optmzaton of MCM Placement Chng-Ma Ko ab, Yu-Jung Huang

More information

Keywords - Wep page classification; bag of words model; topic model; hierarchical classification; Support Vector Machines

Keywords - Wep page classification; bag of words model; topic model; hierarchical classification; Support Vector Machines (IJCSIS) Internatonal Journal of Computer Scence and Informaton Securty, Herarchcal Web Page Classfcaton Based on a Topc Model and Neghborng Pages Integraton Wongkot Srura Phayung Meesad Choochart Haruechayasak

More information

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique //00 :0 AM Outlne and Readng The Greedy Method The Greedy Method Technque (secton.) Fractonal Knapsack Problem (secton..) Task Schedulng (secton..) Mnmum Spannng Trees (secton.) Change Money Problem Greedy

More information

Classifier Ensemble Design using Artificial Bee Colony based Feature Selection

Classifier Ensemble Design using Artificial Bee Colony based Feature Selection IJCSI Internatonal Journal of Computer Scence Issues, Vol. 9, Issue 3, No 2, May 2012 ISSN (Onlne): 1694-0814 www.ijcsi.org 522 Classfer Ensemble Desgn usng Artfcal Bee Colony based Feature Selecton Shunmugaprya

More information

USING MODIFIED FUZZY PARTICLE SWARM OPTIMIZATION ALGORITHM FOR PARAMETER ESTIMATION OF SURGE ARRESTERS MODELS

USING MODIFIED FUZZY PARTICLE SWARM OPTIMIZATION ALGORITHM FOR PARAMETER ESTIMATION OF SURGE ARRESTERS MODELS Internatonal Journal of Innovatve Computng, Informaton and Control ICIC Internatonal c 2012 ISSN 1349-4198 Volume 8, Number 1(B), January 2012 pp. 567 581 USING MODIFIED FUZZY PARTICLE SWARM OPTIMIZATION

More information

Optimizing Document Scoring for Query Retrieval

Optimizing Document Scoring for Query Retrieval Optmzng Document Scorng for Query Retreval Brent Ellwen baellwe@cs.stanford.edu Abstract The goal of ths project was to automate the process of tunng a document query engne. Specfcally, I used machne learnng

More information

Collaboratively Regularized Nearest Points for Set Based Recognition

Collaboratively Regularized Nearest Points for Set Based Recognition Academc Center for Computng and Meda Studes, Kyoto Unversty Collaboratvely Regularzed Nearest Ponts for Set Based Recognton Yang Wu, Mchhko Mnoh, Masayuk Mukunok Kyoto Unversty 9/1/013 BMVC 013 @ Brstol,

More information

Straight Line Detection Based on Particle Swarm Optimization

Straight Line Detection Based on Particle Swarm Optimization Sensors & ransducers 013 b IFSA http://www.sensorsportal.com Straght Lne Detecton Based on Partcle Swarm Optmzaton Shengzhou XU, Jun IE College of computer scence, South-Central Unverst for Natonaltes,

More information

Face Recognition Method Based on Within-class Clustering SVM

Face Recognition Method Based on Within-class Clustering SVM Face Recognton Method Based on Wthn-class Clusterng SVM Yan Wu, Xao Yao and Yng Xa Department of Computer Scence and Engneerng Tong Unversty Shangha, Chna Abstract - A face recognton method based on Wthn-class

More information

Incremental Learning with Support Vector Machines and Fuzzy Set Theory

Incremental Learning with Support Vector Machines and Fuzzy Set Theory The 25th Workshop on Combnatoral Mathematcs and Computaton Theory Incremental Learnng wth Support Vector Machnes and Fuzzy Set Theory Yu-Mng Chuang 1 and Cha-Hwa Ln 2* 1 Department of Computer Scence and

More information

S1 Note. Basis functions.

S1 Note. Basis functions. S1 Note. Bass functons. Contents Types of bass functons...1 The Fourer bass...2 B-splne bass...3 Power and type I error rates wth dfferent numbers of bass functons...4 Table S1. Smulaton results of type

More information

An Improved Image Segmentation Algorithm Based on the Otsu Method

An Improved Image Segmentation Algorithm Based on the Otsu Method 3th ACIS Internatonal Conference on Software Engneerng, Artfcal Intellgence, Networkng arallel/dstrbuted Computng An Improved Image Segmentaton Algorthm Based on the Otsu Method Mengxng Huang, enjao Yu,

More information

Using Neural Networks and Support Vector Machines in Data Mining

RICHARD A. WASNIOWSKI, Computer Science Department, California State University Dominguez Hills, Carson, CA 90747, USA. Abstract: Multivariate data analysis…

RESEARCH ON JOB-SHOP SCHEDULING PROBLEM BASED ON IMPROVED PARTICLE SWARM OPTIMIZATION

Journal of Theoretical and Applied Information Technology, 2005-2013 JATIT & LLS. All rights reserved. 1 ZUFENG ZHONG, 1 School of…

A Load-balancing and Energy-aware Clustering Algorithm in Wireless Ad-hoc Networks

Wang Jin, Shu Lei, Jinsung Cho, Young-Koo Lee, Sungyoung Lee, Yonil Zhong, Department of Computer Engineering, Kyung Hee University,…

Application of Improved Fish Swarm Algorithm in Cloud Computing Resource Scheduling

Application of Improved Fish Swarm Algorithm in Cloud Computing Resource Scheduling , pp.40-45 http://dx.do.org/10.14257/astl.2017.143.08 Applcaton of Improved Fsh Swarm Algorthm n Cloud Computng Resource Schedulng Yu Lu, Fangtao Lu School of Informaton Engneerng, Chongqng Vocatonal Insttute

More information

Japanese Dependency Analysis Based on Improved SVM and KNN

Japanese Dependency Analysis Based on Improved SVM and KNN Proceedngs of the 7th WSEAS Internatonal Conference on Smulaton, Modellng and Optmzaton, Bejng, Chna, September 15-17, 2007 140 Japanese Dependency Analyss Based on Improved SVM and KNN ZHOU HUIWEI and

More information

MULTIOBJECTIVE OPTIMIZATION USING PARALLEL VECTOR EVALUATED PARTICLE SWARM OPTIMIZATION

K.E. Parsopoulos, D.K. Tasoulis, M.N. Vrahatis, Department of Mathematics, University of Patras, Artificial Intelligence Research…

Rule Discovery with Particle Swarm Optimization

Yu Liu 1, Zheng Qin 1,2, Zhewen Shi 1, and Junying Chen 1. 1 Department of Computer Science, Xi'an JiaoTong University, Xi'an 710049, P.R. China. luyu@mailst.xjtu.edu.cn, http://www.psodream.net

VISUAL SELECTION OF SURFACE FEATURES DURING THEIR GEOMETRIC SIMULATION WITH THE HELP OF COMPUTER TECHNOLOGIES

UbiCC 2011, Volume 6, 5002981-x manuscripts. OPEN ACCESS. UbiCC Journal, ISSN 1992-8424, www.ubicc.org.

Design of Structure Optimization with APDL

Yanyun, School of Civil Engineering and Architecture, East China Jiaotong University, Nanchang 330013, China. Abstract: In this paper, the design process of structure optimization with…

The Shortest Path of Touring Lines given in the Plane

The Open Cybernetics & Systemics Journal, 2015, 9, 262-267. Open Access. Lijuan Wang 1,2, Dandan He…

Feature Reduction and Selection

Dr. Shuang LIANG, School of Software Engineering, TongJi University, Fall 2012. Today's Topics: Introduction; Problems of Dimensionality; Feature Reduction; Statistic methods; Principal Components…

UB at GeoCLEF 2006

UB at GeoCLEF Department of Geography   Abstract UB at GeoCLEF 2006 Mguel E. Ruz (1), Stuart Shapro (2), June Abbas (1), Slva B. Southwck (1) and Davd Mark (3) State Unversty of New York at Buffalo (1) Department of Lbrary and Informaton Studes (2) Department

More information