ARTICLE IN PRESS. Applied Soft Computing xxx (2012) xxx xxx. Contents lists available at SciVerse ScienceDirect. Applied Soft Computing

A hierarchical particle swarm optimizer with latin sampling based memetic algorithm for numerical optimization

Yong Peng a,*, Bao-Liang Lu a,b

a Center for Brain-like Computing and Machine Intelligence, Department of Computer Science and Engineering, Shanghai Jiao Tong University, Shanghai, PR China
b MoE-Microsoft Key Laboratory for Intelligent Computing and Intelligent Systems, Shanghai Jiao Tong University, Shanghai, PR China

Article history: received 31 December 2011; received in revised form 13 May 2012; accepted May 2012; available online xxx.

Keywords: Memetic algorithm; Particle swarm optimizer; Latin hypercube sampling; Comprehensive learning; Cylindricity

Abstract: Memetic algorithms, one type of algorithm inspired by nature, have been successfully applied to numerous optimization problems in diverse fields. In this paper, we propose a new memetic computing model that uses a hierarchical particle swarm optimizer (HPSO) and the latin hypercube sampling (LHS) method. In the bottom layer of the hierarchical PSO, several swarms evolve in parallel to avoid being trapped in local optima. The learning strategy for each swarm is the well-known comprehensive learning method with a newly designed mutation operator. After the evolution process in the bottom layer is accomplished, one particle from each swarm is selected as a candidate to construct the swarm in the top layer, which evolves by the same strategy employed in the bottom layer. A local search strategy based on LHS is imposed on particles in the top layer every specified number of generations. The new memetic computing model is extensively evaluated on a suite of 16 numerical optimization functions as well as on the cylindricity error evaluation problem. Experimental results show that the proposed algorithm compares favorably with conventional PSO and several variants. © 2012 Elsevier B.V. All rights reserved.

* Corresponding author. E-mail address: StanY.Peng@gmail.com (Y. Peng).

1. Introduction

Optimization has been a research hotspot for several decades. Many real-world optimization problems in engineering are becoming increasingly complicated, so optimization algorithms with high performance are needed [1,2]. Unconstrained optimization problems can be formulated as D-dimensional minimization problems over continuous space:

min f(x),  x = [x_1, x_2, ..., x_D]    (1)

Evolutionary algorithms, inspired by natural evolution, have been widely used as effective tools to solve optimization problems. One class of nature-inspired algorithms are the swarm intelligence algorithms. The particle swarm optimizer (PSO) [3,4] has attracted attention in both the academic and the industrial community. Although PSO shares many similarities with evolutionary algorithms, the original PSO does not use the traditional evolution operators such as crossover and mutation. PSO draws on the swarm behavior of birds flocking, where they search for food in a collaborative way. Each member of the swarm, called a particle, represents a potential solution to the target problem, and it adapts its search pattern by learning from its own experience and from the experience of other members. A particle is a point in the search space and it aims at finding the global optimum, which is regarded as the location of food. Each particle has two attributes, position and velocity, and its direction of flight is adjusted according to the experiences of the swarm. The swarm as a whole searches for the global optimum in the D-dimensional feasible space. The PSO algorithm is easy to understand and implement, and has been shown to perform well on many optimization problems.
However, it may easily get trapped in a local optimum for many reasons, such as a lack of diversity among particles and over-learning from the best particle found so far. To improve PSO's performance on complex numerical optimization problems, we propose a hierarchical PSO framework in which several swarms evolve in parallel towards the global optimum, and we design a new mutation operator to increase the diversity of the swarms. After evolution for a specified number of generations, a latin hypercube sampling method is used to execute the local search.

This paper is organized as follows. Section 2 introduces the original PSO and some of its variants. Section 3 describes the proposed hierarchical PSO with latin sampling based memetic algorithm in four subsections: the hierarchical PSO framework, the mutation strategy, the latin hypercube sampling based local search strategy, and the overall framework of the proposed memetic algorithm. Section 4 gives the experimental results, describes the related parameter tuning process, and compares the performance of the proposed algorithm on a suite of test problems with that of other PSO variants. Section 5 gives conclusions and describes future work.

Fig. 1. Original particle swarm optimizer.

2. Particle swarm optimizers

2.1. Original PSO

PSO is a stochastic optimization algorithm which simulates swarm behavior. The individuals move over a specified D-dimensional feasible space. As in a genetic algorithm, the particles in PSO are initialized with random velocities and positions. The algorithm adaptively updates the velocity and position of each particle in the swarm by learning from the good experiences. In the original PSO [3], the velocity V_i^d and position X_i^d of the dth dimension of the ith particle are updated as follows:

V_i^d := V_i^d + c_1 · rand1_i^d · (pbest_i^d − X_i^d) + c_2 · rand2_i^d · (gbest^d − X_i^d)
X_i^d := X_i^d + V_i^d    (2)

where X_i = (X_i^1, X_i^2, ..., X_i^D) is the position of the ith particle, V_i = (V_i^1, V_i^2, ..., V_i^D) represents the velocity of particle i, pbest_i = (pbest_i^1, pbest_i^2, ..., pbest_i^D) is the best previous position yielding the best fitness value for the ith particle, and gbest = (gbest^1, gbest^2, ..., gbest^D) is the best position found so far over the whole swarm. c_1 and c_2 are the acceleration constants, reflecting the weighting of the stochastic acceleration terms that pull each particle towards the pbest and gbest positions, respectively. rand1_i^d and rand2_i^d are two random numbers in the range [0,1]. A particle's velocity on each dimension is confined to a maximum magnitude V_max^d: if V_i^d exceeds the pre-specified positive constant V_max^d, then the velocity on that dimension is set to sign(V_i^d)·V_max^d. The framework of the original PSO is shown in Fig. 1.

From the flow of the iterative process, we can see that each particle flies towards the global best particle in the swarm; this leads to a severe drawback of over-learning from the best particle. Consequently, the diversity of the whole swarm drops dramatically. If the best particle does not share the same niche with the global optimum, the particles may easily get trapped in a local optimum. Since PSO's introduction in 1995, many researchers have worked on improving its performance in various ways and many more effective variants have been proposed; these are discussed in the next subsection.

2.2. Some variants of PSO

This section gives a brief survey of several PSO variants proposed in recent years. Shi and Eberhart [5] introduced an inertia weight w into the original PSO algorithm, so the criterion for updating the velocity was changed to

V_i^d := w·V_i^d + c_1 · rand1_i^d · (pbest_i^d − X_i^d) + c_2 · rand2_i^d · (gbest^d − X_i^d)    (3)

They indicated that the inertia weight plays an important role in balancing the global and local search abilities; a large inertia weight encourages global search while a small inertia weight encourages local search. Based on this idea, the inertia weight is usually set to decrease linearly over the iterations. Different types of topologies have also been designed to improve PSO's performance on different optimization problems. Kennedy [6,7] claimed that PSO with a small neighborhood might perform better on complex problems, while PSO with a large neighborhood would perform better on simple problems. Suganthan [8] defined the neighborhood of a particle as its several nearest particles in each iteration, so maintaining this dynamic neighborhood is computationally intensive. Jian et al. [9] examined several neighborhood topologies.
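For concreteness, the velocity and position updates of Eqs. (2) and (3), together with the velocity clamping described above, can be written in a few lines. The following Python sketch is our own minimal illustration, not code from the paper; the parameter values are placeholders.

```python
import numpy as np

def pso_step(X, V, pbest, gbest, w=0.7, c1=2.0, c2=2.0, vmax=1.0):
    """One inertia-weight PSO update (Eqs. (2)-(3)) for a whole swarm.

    X, V, pbest: (num_particles, D) arrays; gbest: (D,) array.
    """
    r1 = np.random.rand(*X.shape)   # rand1 in [0, 1], drawn per particle and dimension
    r2 = np.random.rand(*X.shape)   # rand2 in [0, 1]
    V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
    V = np.clip(V, -vmax, vmax)     # confine each velocity component to [-Vmax, Vmax]
    return X + V, V
```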

The unified PSO (UPSO) proposed by Parsopoulos and Vrahatis [10] combined the global version and the local version of the original PSO. Mendes et al. [11] used all the neighbors of a particle to update the velocity instead of the pbest and the gbest; the neighbors of each particle were selected based on its fitness value and the size of the neighborhood. Peram et al. [12] proposed the fitness-distance-ratio-based PSO (FDR-PSO). When updating each velocity dimension, the FDR-PSO algorithm selects one other particle, nbest, which has a higher fitness value and is nearer to the particle being updated. In comprehensive learning PSO (CLPSO) [13], the velocity of each dimension is influenced by the pbest of other particles, which increases the diversity of the swarm for multimodal optimization problems. In [14], several subswarms were made to coevolve with each other; the entire population was shuffled at periodic stages and the subswarms were reassigned. Yang and Li [15] developed a hierarchical clustering method to partition the original swarm into several subswarms, which locate and track multiple optima in dynamic environments. Wang et al. [16] proposed a memetic algorithm based on a particle swarm optimizer with a ring-shaped topology; later, they improved the algorithm by partitioning the particles in the ring-shaped topology into several species which can update information in parallel [17]. Chen [18] proposed a two-layer PSO (TLPSO) for unconstrained optimization problems, where each subswarm evolves based on the original PSO.

Although the original PSO does not use the traditional evolution operators such as crossover and mutation, researchers have introduced other search techniques, including evolutionary operators, into PSO to improve its performance. Evolutionary operators such as crossover, mutation and selection were used in [19-21]. In Ref. [22], deflection, stretching and repulsion techniques are used to find as many minima as possible by preventing particles from moving to a previously discovered minimal region. Cooperative PSO (CPSO-H) [23] uses one-dimensional swarms to search each dimension separately. In recent years, many advanced operators have been introduced to improve PSO's performance. Ling et al. [24] employed a wavelet-theory-based mutation operation to enhance PSO in exploring the solution space more effectively. Zhao [25] proposed a perturbed PSO (pPSO) algorithm which introduced a perturbed global best to deal with premature convergence and diversity maintenance within the swarm. Gao et al. [26] incorporated a Henon map based mutation operator, divided into global and local mutation operators; this enabled the particles to have a stronger exploration ability and a fast convergence rate.

Although many variants of PSO have been proposed, all of which enhance the performance of the original PSO to some extent, the effectiveness of these variants on diverse problems with different characteristics is still unsatisfying. For example, CLPSO's performance on ill-conditioned problems is poor, and an algorithm with high convergence speed [27] is prone to shrink towards local optima. So taking measures involving the model structure, the velocity updating strategy and the hybrid operators simultaneously, according to the particles' behavior, may be a feasible path to satisfactory results over diverse numerical optimization problems.

3. The proposed memetic algorithm
In this section, we introduce the proposed memetic algorithm in detail; it is based on a hierarchical PSO framework and several search techniques, including a local search strategy based on the latin hypercube sampling method and a hybrid mutation strategy.

Fig. 2. The architecture of the two-layer hierarchical PSO: the gbests of the M bottom-layer swarms (swarm 1, ..., swarm M) form the single top-layer swarm.

3.1. The hierarchical particle swarm optimizer

There are two versions of PSO, global and local, according to the approach of choosing gbest. In the global version, each particle can be influenced by the particle with the best fitness in the whole swarm, which causes all the particles to move and converge quickly onto one point in the search space. By contrast, the local version only allows a particle to be influenced by the best-fitness particle from its neighborhood, which makes the algorithm exhibit a good exploration capacity because the population converges slowly towards the optimal region. Recently, many algorithms have been proposed that partition the population into several subswarms based on Euclidean distance [28], fitness value [29] and other metrics [15,17]. These subswarms are different definitions of the neighborhood, and each particle can only interact with particles in its neighborhood, to avoid converging too fast. Obviously, computing the Euclidean distance is time-consuming when the dimension is high; individuals with similar fitness values, which are prone to be classified into the same group, may lie in different niches; and the species formation method [17] is complicated and partially depends on the distance between particles.

Here, we propose a two-layer hierarchical PSO model. There are M swarms in the bottom layer, with N particles in each swarm, and only one swarm in the top layer. Fig. 2 gives the architecture of the hierarchical PSO. In each swarm of the bottom layer, particles move towards the optimum based on the comprehensive learning method [13] described below, which is a typical local version of PSO. After each iteration, the M swarms in the bottom layer generate M best particles, which become the candidates for the top layer. So in the top layer, the number of particles is identical to the number of swarms in the bottom layer, and they are trained by comprehensive learning as well. The reasons for selecting a hierarchical PSO can be stated as follows. First, several swarms evolving in parallel have a good chance of reaching the global optimum even if some of them stagnate in local optima. Second, the swarms are generated randomly, which saves the time of computing neighborhoods based on Euclidean distance. Though simple, this approach might be effective. In this model, the movement of particles in the bottom layer is similar to a local search and the movement of particles in the top layer is similar to a global search. The best particle in the top layer influences the particles in the bottom layer only indirectly, so the speed of convergence slows down. Thus this model can work for both exploitation and exploration simultaneously.

The comprehensive learning method [13] used to train particles in the hierarchical PSO model is specifically designed for complex multimodal problems. Simply speaking, CLPSO designs a set of exemplars pbest^d_{f_i(d)} for each particle i to update its velocity, instead of the traditional pbest and gbest, which enlarges the search scope and enhances the performance of the local search. Fig. 3 gives the flow of the comprehensive learning method.
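To make the comprehensive learning idea of Fig. 3 concrete: each dimension of a particle learns either from its own pbest or from the pbest of another particle selected by a small fitness tournament. The sketch below follows the description of CLPSO in [13]; the learning probability Pc, the tournament of size two and all names are our assumptions, not code from the paper.

```python
import numpy as np

def build_exemplar(pbest, fitness, i, Pc=0.3):
    """Assemble a comprehensive-learning exemplar for particle i.

    pbest: (num_particles, D) array of personal bests; fitness: their fitness
    values (lower is better). Each dimension learns, with probability Pc, from
    the better of two randomly chosen particles' pbests, otherwise from its own.
    """
    n, D = pbest.shape
    exemplar = pbest[i].copy()
    for d in range(D):
        if np.random.rand() < Pc:
            a, b = np.random.choice(n, size=2, replace=False)
            winner = a if fitness[a] < fitness[b] else b
            exemplar[d] = pbest[winner, d]
    return exemplar

def cl_velocity_update(V, X, exemplar, w=0.7, c=1.5):
    """CLPSO-style velocity update: the exemplar replaces pbest/gbest of Eq. (2)."""
    return w * V + c * np.random.rand(*X.shape) * (exemplar - X)
```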

Fig. 3. Comprehensive learning PSO.

3.2. Mutation strategy

Most variants of PSO adopt strategies that update the old velocity vector based on the particles in the neighborhood, so they have difficulty adapting quickly to the different optimization stages of ill-conditioned problems. In this subsection, we propose a new mutation operator, inspired by the mutation operation in Differential Evolution (DE) [30]. It updates the particles' positions based on differential information and the pbest. The mutation operator can be formulated as

X_i^d := c·(X_k^d − X_j^d) + c·(pbest_i^d − X_i^d),  c ~ N(0.5, σ)    (4)

where X_k^d and X_j^d are the dth variables of two randomly selected other particles, and N(0.5, σ) is the Gaussian distribution with mean 0.5 and a fixed standard deviation. We carry out the mutation operation, with probability Pm, after updating the pbest and gbest in both the bottom layer and the top layer, for every particle except the best particle in each swarm. This operator generates a disturbance when particles' positions are close to local optima.

3.3. Local search based on latin sampling

Latin hypercube sampling, proposed by McKay [31], is a stratified sampling approach. This paper employs this sampling method to exploit the promising subspace that has been found so far. Suppose that V is a hypercube with dimension n, of which each dimension x_i is bounded by [x_i^l, x_i^u] (i = 1, 2, ..., n, where x_i^l and x_i^u are the lower and upper bounds of dimension i, respectively); then the algorithm for generating H samples in this hypercube V is given in Fig. 4. Here, a simple instance is provided to demonstrate the latin sampling process in detail. If the dimension of the hypercube is two and the sampling scale is eight, then a satisfactory sampling matrix A is formed:

A = [p_1; p_2]^T    (5)

where p_1 and p_2 are permutations of the interval indices {1, 2, ..., 8}; after the transpose, each of the eight rows of A selects one sub-interval in each of the two dimensions. The corresponding samples in the hypercube are shown in Fig. 5.

Fig. 4. Latin sampling process in a hypercube.
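The sampling matrix of Eq. (5) translates directly into code: one index permutation per dimension, plus a position inside each selected sub-interval. The following Python sketch is a minimal illustration under our own assumptions (uniform jitter within each sub-interval), not the authors' implementation.

```python
import numpy as np

def latin_hypercube_samples(lower, upper, H):
    """Generate H latin hypercube samples in the box [lower, upper] (length-n arrays).

    Each dimension is split into H equal sub-intervals whose indices are permuted
    independently, so every sub-interval of every dimension holds exactly one sample.
    """
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    n = lower.size
    # Column j of A is a random permutation of {0, ..., H-1}, as in Eq. (5).
    A = np.stack([np.random.permutation(H) for _ in range(n)], axis=1)
    jitter = np.random.rand(H, n)   # assumed uniform position inside each sub-interval
    return lower + (A + jitter) * (upper - lower) / H
```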

Latin Hypercube Sampling in n-Dimensional Space (algorithm listing).

Fig. 5. Latin hypercube sampling in two-dimensional space.

Latin hypercube sampling can be viewed as a space-filling design, which means that one and only one sample is selected in each row or column of sub-hypercubes. So the samples generated by latin sampling are distributed uniformly in the hypercube space, and this helps maintain the diversity of the population.

3.4. The proposed memetic algorithm

In this section, we introduce the hierarchical PSO with latin hypercube sampling based memetic algorithm (MA-HPSOL) as a whole. Fig. 6 gives the overall framework of the proposed algorithm. From Fig. 6, we see that the hierarchical PSO is the main framework of the proposed memetic algorithm. Swarms in the framework are trained by the comprehensive learning method. The latin hypercube sampling based local search is performed every ten iterations. Furthermore, a differential information based mutation operator is employed to maintain the diversity of the swarms. To describe the proposed algorithm more explicitly, the complete flow chart of MA-HPSOL is given in Fig. 7. In the next section, a large number of test problems are used to evaluate the performance of the proposed algorithm.

Suppose that the computation cost of one particle in the comprehensive learning update is c, the cost of the mutation operator is c_m, and the cost of the latin local search is c_l; then the total computation cost of MA-HPSOL for one generation is (M + 1)(c + c_m) + M·c_l. When solving real-world problems, however, the fitness evaluation usually accounts for most of the time, as PSO itself is highly computationally efficient. So the algorithm-related computation times are not given in this paper.

Fig. 6. The proposed memetic algorithm (MA-HPSOL).

4. Experimental study

In this section, we evaluate the performance of MA-HPSOL by solving 16 numerical optimization problems, including eight conventional unimodal and multimodal benchmarks, six rotated benchmarks and two composition problems. The test problems are scalable to any number of variables, so we mainly employ the test problems with 10 and 30 variables. We compare MA-HPSOL with PSO with inertia weight [5], UPSO [10], FDR-PSO [12], CLPSO [13] and TLPSO [18].

4.1. Test functions

In this subsection, we choose 16 function optimization problems to demonstrate the effectiveness of the proposed MA-HPSOL algorithm. They can be classified into four types: unimodal, multimodal, rotated and composite problems. Table 1 tabulates the benchmark test functions with their notable characteristics. The detailed characteristics of these test functions can be found in [32].

Fig. 7. Flow chart of the proposed algorithm (MA-HPSOL): in each generation k the M bottom-layer swarms update velocities V, positions, pbest and gbest_1, ..., gbest_M by comprehensive learning; the M gbests form the top-layer swarm, which is updated in the same way; mutation is applied; and whenever mod(k, 10) == 0 the latin-sampling local search refines the top-layer particles, until the generation limit Gen is reached.

4.2. Sensitivity in relation to parameters

For the proposed MA-HPSOL algorithm, there are four parameters: M (the number of swarms in the bottom layer), N (the number of particles in each swarm), the sampling scale p, and the length δ of each dimension of the hypercube.

Sensitivity in relation to M and N. We ran MA-HPSOL on several representative test functions with the number of swarms M and the number of particles N in each swarm each increased in steps of 1 over a range of small values. The values of the other parameters are as follows: the sampling scale is 10, the length δ of each dimension of the hypercube is twice the length of the corresponding dimension of the selected particle, and the mutation probability is the inverse of the dimensionality D (here only D = 10 is considered). The maximum number of function evaluations is set at 100,000. The data are statistical averages over 30 independent runs. The results are shown in Fig. 8. From the experimental results depicted in Fig. 8, we can easily see that small values of M and N are encouraged by MA-HPSOL. Therefore, the number of swarms M and the number of particles N in each swarm are both set to 3 in all following experiments.

Sensitivity to the length δ of each dimension of the hypercube. How do we obtain a proper neighborhood for exploitation once a particle has been selected to execute the local search? Because each dimension of the particle may differ from the others, the length δ of each dimension of the hypercube should adapt to the selected particle. Here, we use a simple rule that shows excellent performance in our experiments: the length δ of each dimension of the hypercube is twice the length of the corresponding dimension of the selected particle.

Sensitivity to the sampling scale p. It is difficult to choose a proper sampling scale p: if we choose a larger value, few generations remain for evolution, and if we choose a smaller value, the neighborhood of a selected particle may not be exploited sufficiently (when the maximum number of function evaluations is fixed). So we must strike a balance between the sampling scale p and the number of generations of evolution. The experimental results of MA-HPSOL on two test functions, with the sampling scale p varied from 5 to 30 in steps of 5, are shown in Fig. 9. Both the number of swarms M and the number of particles N are set to 3, and the other parameters are the same as mentioned above. From the results depicted in Fig. 9, it is obvious that MA-HPSOL gets better results when the sampling scale is set to 5, 10 or 15. Therefore, the sampling scale p is set to 10 in all following experiments.

All the parameters for MA-HPSOL are summarized in Table 2, where δ set to 2 means that the length of each dimension of the hypercube is two times the length of the corresponding dimension of the selected particle, and MAXFES stands for the maximum number of function evaluations (10,000 × dimension).
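With these parameter roles in mind, one generation of MA-HPSOL (Sections 3.1-3.4, Fig. 7) can be outlined as below. This is a structural sketch under our own naming: the three operators are passed in as callables and all bookkeeping of fitness evaluations is elided, so it should not be read as the authors' implementation.

```python
def ma_hpsol_generation(swarms, top_swarm, cl_update, mutate, latin_search,
                        gen, local_search_period=10):
    """One MA-HPSOL generation in the spirit of Fig. 7.

    swarms: the M bottom-layer swarms; top_swarm: the single top-layer swarm.
    cl_update(swarm): comprehensive learning update (Section 3.1).
    mutate(swarm): Eq. (4) mutation with probability Pm, sparing the swarm's
    best particle (Section 3.2).
    latin_search(swarm): latin hypercube local search around the best solutions
    found so far (Section 3.3).
    """
    for swarm in swarms:                  # bottom layer: M swarms evolve in parallel
        cl_update(swarm)
        mutate(swarm)
    top_swarm.particles = [s.gbest for s in swarms]   # the M gbests form the top layer
    cl_update(top_swarm)
    mutate(top_swarm)
    if gen % local_search_period == 0:    # every tenth generation
        latin_search(top_swarm)
```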

Fig. 8. MA-HPSOL sensitivity in relation to the parameters M and N: four panels plot log10(fitness value) of the selected test functions against the number of swarms M and the number of particles N.

Fig. 9. MA-HPSOL sensitivity in relation to the sampling scale p: two panels plot the fitness value of the selected test functions against the value of the sampling scale p.

Table 1. Benchmark functions used in this study (all have global optimum 0; M is an orthogonal rotation matrix).

f1 Sphere: f1(x) = Σ_{i=1}^{D} x_i^2. Range [−100, 100]^D. Unimodal.
f2 Rosenbrock: f2(x) = Σ_{i=1}^{D−1} (100(x_i^2 − x_{i+1})^2 + (x_i − 1)^2). Range [−2.048, 2.048]^D. Unimodal.
f3 Ackley: f3(x) = −20 exp(−0.2 √(Σ x_i^2 / D)) − exp(Σ cos(2πx_i)/D) + 20 + e. Range [−32.768, 32.768]^D. Multimodal.
f4 Griewank: f4(x) = Σ x_i^2/4000 − Π cos(x_i/√i) + 1. Range [−600, 600]^D. Multimodal.
f5 Weierstrass: f5(x) = Σ_{i=1}^{D} Σ_{k=0}^{kmax} [a^k cos(2π b^k (x_i + 0.5))] − D Σ_{k=0}^{kmax} a^k cos(π b^k), a = 0.5, b = 3, kmax = 20. Range [−0.5, 0.5]^D. Multimodal.
f6 Rastrigin: f6(x) = Σ (x_i^2 − 10 cos(2π x_i) + 10). Range [−5.12, 5.12]^D. Multimodal.
f7 Noncontinuous Rastrigin: f7(x) = Σ (y_i^2 − 10 cos(2π y_i) + 10), y_i = x_i if |x_i| < 0.5, otherwise round(2x_i)/2. Range [−5.12, 5.12]^D. Multimodal.
f8 Schwefel: f8(x) = 418.9829·D − Σ x_i sin(√|x_i|). Range [−500, 500]^D. Multimodal.
f9(x) = f3(y), y = M·x. Range [−32.768, 32.768]^D. Rotated.
f10(x) = f4(y), y = M·x. Range [−600, 600]^D. Rotated.
f11(x) = f5(y), y = M·x. Range [−0.5, 0.5]^D. Rotated.
f12(x) = f6(y), y = M·x. Range [−5.12, 5.12]^D. Rotated.
f13(x) = f7(y), y = M·x. Range [−5.12, 5.12]^D. Rotated.
f14 Rotated Schwefel: f14(x) = 418.9829·D − Σ z_i, with z_i = y_i sin(√|y_i|) if |y_i| ≤ 500 and z_i = 0.001(|y_i| − 500)^2 otherwise, y = M(x − 420.96) + 420.96. Range [−500, 500]^D. Rotated.
f15 = CF1. Range [−5, 5]^D. Composition.
f16 = CF2. Range [−5, 5]^D. Composition.

Table 2. Parameter settings for MA-HPSOL.
Dimension 10-D: {M, N} = {3, 3}, p = 10, δ = 2, MAXFES = 10,000 × 10.
Dimension 30-D: {M, N} = {3, 3}, p = 10, δ = 2, MAXFES = 10,000 × 30.

4.3. Experimental results of MA-HPSOL on the test functions

In this section, we give the experimental results obtained by MA-HPSOL in optimizing the above-mentioned 16 functions with 10 and 30 variables. Based on the parameter sensitivity analysis, the parameters are set as shown in Table 2. Table 3 shows the statistical results of MA-HPSOL in optimizing the 16 functions with ten variables (10-D functions) over 30 independent runs, including the maximum, minimum, mean and standard deviation. The termination criterion in this experiment is to run MA-HPSOL until the number of function evaluations reaches the maximum value of 100,000.

Table 3. Statistical results of MA-HPSOL in optimizing the 10-D functions over 30 independent runs (maximum, minimum, mean and standard deviation per function). For f1, f3, f4, f5, f6, f7, f9, f10, f11, f12 and f13, all four statistics equal the optimal value 0; f2, f8, f14, f15 and f16 do not reach the optimum exactly (the mean for f14, for instance, is 1.77e-9).

Obviously MA-HPSOL performs very well on most of the 16 functions. For functions 1, 3, 4, 5, 6, 7, 9, 10, 11, 12 and 13, the maximum, minimum and mean values of the 30 runs are all equal to the optimal values. The performance of MA-HPSOL is stable because the diversity is kept at a high level, avoiding premature convergence. But when solving functions 2, 8, 14, 15 and 16, MA-HPSOL does not obtain accurate optimal results. Due to the ill-conditioned nature of function 2 (the Rosenbrock problem) and function 8 (the Schwefel problem), which have optimal solutions at (1, 1, ..., 1) and (420.96, 420.96, ..., 420.96) respectively, it is hard to adapt quickly to the different optimization stages. Also, function 14 is the rotated version of function 8, so there is some distance between the local optimum found (1.77e-9) and the global optimum. For the two composition functions, most of the solutions are obviously worse than the optimal values. The reason for the poor performance is that both functions are more challenging problems, with a randomly located global optimum and several randomly located deep local optima. They are asymmetrical multimodal problems with different properties in different areas. Due to the complex shape of the composition functions, it is difficult to reach the same accuracy as on the benchmark functions. However, MA-HPSOL obtains relatively good results on these two composite functions when compared with some state-of-the-art algorithms, as shown in the following part.
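For orientation, two entries of Table 1 and the construction of a rotated variant can be written as short functions. The QR-based way of drawing the orthogonal matrix M is our assumption; the paper only states that rotated functions use y = M·x.

```python
import numpy as np

def sphere(x):
    """f1 of Table 1: unimodal sphere function."""
    return np.sum(x ** 2)

def ackley(x):
    """f3 of Table 1: multimodal Ackley function."""
    d = x.size
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / d))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / d) + 20.0 + np.e)

def rotated(f, D, seed=0):
    """Rotated benchmark f(Mx), e.g. f9 = f3(Mx), with a random orthogonal M."""
    M, _ = np.linalg.qr(np.random.default_rng(seed).standard_normal((D, D)))
    return lambda x: f(M @ x)

f9 = rotated(ackley, 10)   # a 10-dimensional rotated Ackley function
```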

The experimental results of MA-HPSOL in optimizing the 16 functions with 30 variables (30-D functions) are shown in Table 4. The maximum number of function evaluations is set at 300,000; the other parameters are the same as those for the 10-D functions. The statistical results in Table 4 are obtained from 30 independent runs.

Table 4. Statistical results of MA-HPSOL in optimizing the 30-D functions over 30 independent runs (maximum, minimum, mean and standard deviation per function). For f4, f5, f6, f7, f9, f10, f11, f12 and f13, all four statistics equal the optimal value 0; f1, f2, f3, f8, f14, f15 and f16 come close to but do not reach the optimum.

As shown in Table 4, when solving functions 4, 5, 6, 7, 9, 10, 11, 12 and 13, the statistical results, including the maximum, minimum and mean values of the 30 runs, are all equal to the optimal values. The results for functions 1 and 3 are not as good as those obtained in the 10-D experiment. The main reason for this phenomenon may be the lack of a sufficient number of particles for exploring the feasible space. We use the same pair {M, N} = {3, 3} for both the 10-D and 30-D experiments, which means the total number of particles in the population is 9. This number of particles is adequate for the 10-D experiments, and MA-HPSOL obtains promising results, as do some other algorithms [13]. But the landscape of the 30-D test functions is so complex that it is difficult to explore such a high-dimensional feasible space (D = 30) with so few particles. As the results for functions 2, 8, 14, 15 and 16 show, MA-HPSOL can obtain values which are very close to the optima. Also, the standard deviations are very small for all of these functions, which means that MA-HPSOL exhibits excellent stability over all 30 runs.

4.4. Comparisons with state-of-the-art algorithms

In order to further verify the effectiveness of MA-HPSOL, we evaluate its performance by comparing it with five existing algorithms: PSO with inertia weight [5], UPSO [10], FDR-PSO [12], CLPSO [13] and TLPSO [18]. For easy comparison with the state-of-the-art algorithms, the population size of all six algorithms is set to 9. Any algorithm-specific parameters are set exactly as in the original work. The termination criterion is to run the algorithms until the number of function evaluations reaches the maximum value of 100,000. All the results given in Table 5 are based on 30 independent runs.

Table 5. Results of the six algorithms on the 10-D functions: mean ± standard deviation over 30 runs for PSO with inertia weight, UPSO, FDR-PSO, CLPSO, TLPSO and MA-HPSOL on f1-f16. MA-HPSOL attains 0 ± 0 on every function except f2, f8, f14, f15 and f16, where its entries are the means and standard deviations reported in Table 3.

Table 6. Results of the six algorithms on the 30-D functions: mean ± standard deviation over 30 runs for PSO with inertia weight, UPSO, FDR-PSO, CLPSO, TLPSO and MA-HPSOL on f1-f16. MA-HPSOL attains 0 ± 0 on every function except f1, f2, f3, f8, f14, f15 and f16, where its entries are the means and standard deviations reported in Table 4.

Furthermore, a distance function Index(D), describing the mean distance between the optimal solution and the obtained best solution, is defined as follows [33]:

Index(D) = |f_opt(D) − f_best(D)| / D,    (6)

where f_opt(D) and f_best(D) are the optimal solution and the obtained best solution, respectively. This metric is usually used to compare how quickly the differences between the solutions obtained by various evolutionary algorithms and the target solution decrease. In this paper, the optima of all the test functions are 0 and the obtained best solutions are usually very close to 0, so we use log10(f_best(D)) instead of f_best(D) to narrow the interval of the metric. But the absolute value of the logarithm is not monotonic, so we modify Index(D) to Dist(D) as follows, which makes the results of each algorithm easy to visualize:

Dist(D) = (log10(f_best(D)) − f_opt(D)) / D.    (7)

Fig. 10 presents the Dist(D) values, in terms of the best fitness value of the median run of each algorithm, for each test function (D = 10). We record the best solution at fixed intervals of function evaluations for each test problem, with 100,000 function evaluations in total; the horizontal coordinate indexes these recording points and the vertical coordinate shows Dist(D).

From the results in Table 5 and Fig. 10, we observe that MA-HPSOL surpasses all the other algorithms on all functions except one, on which one of the competitors performs better than MA-HPSOL. However, when we ran both that algorithm and MA-HPSOL 50 times independently, we found that it becomes trapped in local optima in a few of the runs (3 of the 50), while MA-HPSOL still obtains results of consistent precision. The convergence characteristics of MA-HPSOL are very promising on unimodal, multimodal, rotated multimodal and composite problems. One of the compared variants performs well only on the unimodal function 1, on which the other algorithms also obtain good results. Another shows good convergence characteristics on functions 1, 3 and 9, while FDR-PSO has relatively good convergence on functions 3 and 9; this is a reasonable phenomenon because function 9 is just the rotated version of function 3. CLPSO shows good convergence in optimizing functions 1, 3, 5, 7, 9 and several others.
Because particles in MA-HPSOL are trained with the comprehensive learning method, MA-HPSOL and CLPSO share some similar convergence characteristics, for instance on functions 1, 3, 9, 15 and 16, with the difference that MA-HPSOL converges faster than CLPSO. Especially for functions 3, 4, 5, 6, 7, 9, 10, 11, 12 and 13, MA-HPSOL converges to the global optimum within a small fraction of the allotted function evaluations on the whole. This is mainly caused by the multiple swarms in the hierarchical architecture and by the local search strategy.

Table 6 gives the means and standard deviations of the 30 runs of the six algorithms on the sixteen test functions with D = 30. As the convergence graphs are similar to those of the 10-D problems, they are not presented here. It is an acid test for these algorithms to hold a population with just 9 particles.
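For completeness, the Dist(D) curves of Fig. 10 can be recomputed from a recorded history of best-so-far fitness values with a one-line transformation of Eq. (7); the array name is ours.

```python
import numpy as np

def dist_curve(best_fitness_history, f_opt=0.0, D=10):
    """Dist(D) of Eq. (7) for a series of best-so-far fitness values.

    Note: an entry of exactly 0 yields -inf, i.e. the optimum has been reached.
    """
    best = np.asarray(best_fitness_history, dtype=float)
    return (np.log10(best) - f_opt) / D
```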

Fig. 10. The median Dist(D) values of the 10-D test functions, one panel per function (functions 1-8 on this page); the horizontal axis is the number of recorded function evaluations and the vertical axis is the index Dist(D = 10).

Fig. 10 (continued). Median Dist(D) values for functions 9-16.

From the results in Table 6, we can observe that the performance of almost all the algorithms except MA-HPSOL degrades dramatically when optimizing high-dimensional problems with such a small population size. Taking one of the stronger competitors as an example, it can attain high precision with its usual population size, but only a precision of about 1e-3 with a population size of 9.

4.5. Cylindricity error evaluation based on MA-HPSOL

In the past few years, many kinds of evolutionary algorithms have contributed to optimizing a wide range of manufacturing processes [34-36], whose demands to be more robust, more flexible and more complex are ever increasing. Cylindrical features have become one of the most important features in mechanical design. They contribute significantly to fundamental mechanical products such as transmission systems, revolving devices and injection molds, helping them achieve their intended functionalities. Therefore, evaluating the cylindricity error precisely is very important in high-precision manufacturing, and many attempts have been made at it [37,38]. The definition of the cylindricity error can be stated as follows [39]. Fig. 11 illustrates the cross section of a cylinder with axis direction n = (l, m, 1) and radius R. The projection of a measured point P_i onto the cylinder is F_i. Assuming the axis passes through the point Q(x0, y0, 0), the axis can be expressed as (x − x0)/l = (y − y0)/m = z. The distance from P_i (i = 1, 2, ..., N) to the axis is

e_i = |E_i P_i| = |QP_i × n| / |n| = |(x_i − x0, y_i − y0, z_i) × (l, m, 1)| / sqrt(l^2 + m^2 + 1),    (8)

where |·| denotes the length of a vector in Euclidean space. Mathematically, the cylindricity error evaluation can be formulated as an optimization problem with parameter vector (x0, y0, l, m). Hence, the fitness function for evaluating the cylindricity error under the minimum zone cylinder (MZC) criterion aims at minimizing the objective

f(x0, y0, l, m) = max(e_i) − min(e_i).    (9)

Fig. 11. Definition of the cylindricity error.

Here we evaluate the cylindricity error with the proposed MA-HPSOL algorithm, and the related settings are as follows: (1) the MA-HPSOL dependent parameters are set as shown in Table 2; (2) the dimension of the particles is 4, which is the length of the parameter vector (x0, y0, l, m); (3) the termination condition is a fixed maximum number of generations. The remaining parameters are the same as in [13]. The measurement data sets are taken from Refs. [38,40], and all parameters are initialized in [0,1]. The evaluation results are given in Tables 7 and 8 (a small illustrative sketch of this objective function is given after the conclusion). As shown in Tables 7 and 8, the proposed MA-HPSOL algorithm is a competitive approach for cylindricity error evaluation, which is obviously a complicated optimization problem. When compared with other types of evolutionary algorithms (improved GA [37,41], PSO [38], PSO-DE [39]), the results obtained by MA-HPSOL are better than those listed in the existing literature.

Table 7. Results of data set 1 for cylindricity error evaluation: the fitted parameters x0, y0, z0, l, m, n and the resulting cylindricity value obtained by the improved GA [37], PSO [38] and MA-HPSOL.

Table 8. Results of data set 2 for cylindricity error evaluation: the fitted parameters x0, y0, z0, l, m, n and the resulting cylindricity value obtained by the improved GA [41], PSO-DE [39] and MA-HPSOL.

5. Conclusion and future work

This paper presents a high-performance memetic algorithm (MA-HPSOL) for complex numerical optimization problems. Within the framework of the proposed algorithm there are three main components: a hierarchical particle swarm optimizer for exploration, a local search method based on latin hypercube sampling for exploitation, and a mutation operator using differential information. Concretely, the hierarchical PSO is composed of two layers, the bottom layer and the top layer. Particles in each swarm of the bottom layer evolve independently, which means each swarm is a niche with no influence on the other swarms.
The global best position of each swarm in the bottom layer becomes a candidate particle in the top layer, so the global best position of the top-layer swarm steers the particles in each swarm of the bottom layer indirectly. The local search strategy, latin hypercube sampling, aims at exploiting the neighborhood of the best solutions found so far uniformly. Both the exploration and the exploitation operators help keep the diversity of the whole population at a high level, preventing particles from being trapped in local optima; even if particles in one swarm are trapped in local optima, other swarms are still likely to reach the global optimum. Furthermore, a mutation operator, aiming at modifying the particles' positions based on differential information, is used. According to the experimental results on the 16 functions, the proposed memetic algorithm (MA-HPSOL) has excellent performance in finding globally optimal solutions. MA-HPSOL is also used to evaluate the cylindricity error, and the experimental results show that it obtains competitive performance there as well. In our future work, two aspects will be investigated in depth: quantitatively depicting the diversity of the whole population, and imposing mutual communication among the swarms in the bottom layer.

Acknowledgments

This work was partially supported by the National Basic Research Program of China (grant no. 9CB391) and the European Union Seventh Framework Program (grant no. 719). We would like to thank Prof. P. N. Suganthan for providing the source code of CLPSO.
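As a companion to Eqs. (8) and (9) of Section 4.5, here is a small, self-contained sketch of the minimum zone cylinder objective that the optimizer minimizes over (x0, y0, l, m); the data layout and names are our own illustrative choices, not code from the paper.

```python
import numpy as np

def cylindricity_error(params, points):
    """MZC objective of Eq. (9): max(e_i) - min(e_i) over the measured points.

    params = (x0, y0, l, m) defines an axis through Q = (x0, y0, 0) with
    direction n = (l, m, 1); points is an (N, 3) array of measured (x, y, z).
    """
    x0, y0, l, m = params
    n = np.array([l, m, 1.0])
    qp = points - np.array([x0, y0, 0.0])                              # vectors Q -> P_i
    e = np.linalg.norm(np.cross(qp, n), axis=1) / np.linalg.norm(n)    # Eq. (8)
    return e.max() - e.min()

# Any minimizer over (x0, y0, l, m) -- MA-HPSOL in the paper -- can use this objective.
```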

References

[1] H. Azamathulla, F. Wu, Support vector machine approach for longitudinal dispersion coefficients in natural streams, Applied Soft Computing 11 (2011).
[2] H. Azamathulla, A. Ghani, C. Chang, Z. Hasan, N. Zakaria, Machine learning approach to predict sediment load - a case study, Clean - Soil, Air, Water 3 (1).
[3] J. Kennedy, R. Eberhart, Particle swarm optimization, in: Proceedings of the IEEE International Conference on Neural Networks, vol. 4, 1995, pp. 1942-1948.
[4] R. Eberhart, J. Kennedy, A new optimizer using particle swarm theory, in: Proceedings of the Sixth International Symposium on Micro Machine and Human Science, 1995, pp. 39-43.
[5] Y. Shi, R. Eberhart, A modified particle swarm optimizer, in: Proceedings of the IEEE International Conference on Evolutionary Computation, 1998, pp. 69-73.
[6] J. Kennedy, Small worlds and mega-minds: effects of neighborhood topology on particle swarm performance, in: Proceedings of the IEEE Congress on Evolutionary Computation, vol. 3, 1999.
[7] J. Kennedy, R. Mendes, Population structure and particle swarm performance, in: Proceedings of the IEEE Congress on Evolutionary Computation, vol. 2, IEEE, 2002.
[8] P. Suganthan, Particle swarm optimiser with neighbourhood operator, in: Proceedings of the IEEE Congress on Evolutionary Computation, vol. 3, 1999.
[9] W. Jian, Y. Xue, J. Qian, Improved particle swarm optimization algorithms study based on the neighborhoods topologies, in: Proceedings of the IEEE Annual Conference of the Industrial Electronics Society, vol. 3.
[10] K. Parsopoulos, M. Vrahatis, UPSO: a unified particle swarm optimization scheme, Lecture Series on Computer and Computational Sciences 1 (2004).
[11] R. Mendes, J. Kennedy, J. Neves, The fully informed particle swarm: simpler, maybe better, IEEE Transactions on Evolutionary Computation 8 (3) (2004) 204-210.
[12] T. Peram, K. Veeramachaneni, C. Mohan, Fitness-distance-ratio based particle swarm optimization, in: Proceedings of the IEEE Swarm Intelligence Symposium, 2003.
[13] J. Liang, A. Qin, P. Suganthan, S. Baskar, Comprehensive learning particle swarm optimizer for global optimization of multimodal functions, IEEE Transactions on Evolutionary Computation 10 (3) (2006) 281-295.
[14] Y. Jiang, T. Hu, C. Huang, X. Wu, An improved particle swarm optimization algorithm, Applied Mathematics and Computation 193 (1) (2007).
[15] S. Yang, C. Li, A clustering particle swarm optimizer for locating and tracking multiple optima in dynamic environments, IEEE Transactions on Evolutionary Computation 14 (2010).
[16] H. Wang, S. Yang, W. Ip, D. Wang, A particle swarm optimization based memetic algorithm for dynamic optimization problems, Natural Computing 9 (3) (2010).
[17] H. Wang, S. Yang, W.H. Ip, D. Wang, A memetic particle swarm optimisation algorithm for dynamic multi-modal optimisation problems, International Journal of Systems Science 43 (7) (2012).
[18] C. Chen, Two-layer particle swarm optimization for unconstrained optimization problems, Applied Soft Computing 11 (1) (2011).
[19] P. Angeline, Using selection to improve particle swarm optimization, in: Proceedings of the IEEE International Conference on Evolutionary Computation, 1998.
[20] M. Lovbjerg, T. Rasmussen, T. Krink, Hybrid particle swarm optimiser with breeding and subpopulations, in: Proceedings of the Third Genetic and Evolutionary Computation Conference, vol. 1, Citeseer, 2001.
[21] V. Miranda, N. Fonseca, EPSO - evolutionary particle swarm optimization, a new algorithm with applications in power systems, in: IEEE/PES Transmission and Distribution Conference and Exhibition: Asia Pacific, vol. 2, 2002.
[22] K. Parsopoulos, M. Vrahatis, On the computation of all global minimizers through particle swarm optimization, IEEE Transactions on Evolutionary Computation 8 (3) (2004) 211-224.
[23] F. van den Bergh, A. Engelbrecht, A cooperative approach to particle swarm optimization, IEEE Transactions on Evolutionary Computation 8 (3) (2004) 225-239.
[24] S. Ling, H. Iu, K. Chan, H. Lam, B. Yeung, F. Leung, Hybrid particle swarm optimization with wavelet mutation and its industrial applications, IEEE Transactions on Systems, Man, and Cybernetics, Part B 38 (3) (2008).
[25] X. Zhao, A perturbed particle swarm algorithm for numerical optimization, Applied Soft Computing 10 (1) (2010).
[26] H. Gao, W. Xu, Particle swarm algorithm with hybrid mutation strategy, Applied Soft Computing 11 (2011).
[27] S. Hsieh, T. Sun, C. Liu, S. Tsai, Efficient population utilization strategy for particle swarm optimizer, IEEE Transactions on Systems, Man, and Cybernetics, Part B 39 (2) (2009).
[28] R. Brits, A. Engelbrecht, F. van den Bergh, Solving systems of unconstrained equations using particle swarm optimization, in: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, vol. 3, 2002.
[29] N. Huy, O. Soon, L. Hiot, N. Krasnogor, Adaptive cellular memetic algorithms, Evolutionary Computation 17 (2) (2009).
[30] A. Qin, V. Huang, P. Suganthan, Differential evolution algorithm with strategy adaptation for global numerical optimization, IEEE Transactions on Evolutionary Computation 13 (2) (2009) 398-417.
[31] M. McKay, R. Beckman, W. Conover, A comparison of three methods for selecting values of input variables in the analysis of output from a computer code, Technometrics 21 (1979) 239-245.
[32] J. Liang, P. Suganthan, K. Deb, Novel composition test functions for numerical global optimization, in: Proceedings of the IEEE Swarm Intelligence Symposium, 2005, pp. 68-75.
[33] S. Ho, L. Shu, J. Chen, Intelligent evolutionary algorithms for large parameter optimization problems, IEEE Transactions on Evolutionary Computation 8 (2004).
[34] K. Chan, C. Kwong, Y. Tsim, A genetic programming based fuzzy regression approach to modelling manufacturing processes, International Journal of Production Research (7).
[35] K. Chan, C. Kwong, Y. Tsim, Modelling and optimization of fluid dispensing for electronic packaging using neural fuzzy networks and genetic algorithms, Engineering Applications of Artificial Intelligence 23 (1) (2010).
[36] K. Chan, C. Kwong, H. Jiang, M. Aydin, T. Fogarty, A new orthogonal array based crossover, with analysis of gene interactions, for evolutionary algorithms and its application to car door design, Expert Systems with Applications 37 (5) (2010).
[37] H. Lin, Y. Peng, Evaluation of cylindricity error based on an improved GA with uniform initial population, in: Proceedings of the IITA International Conference on Control, Automation and Systems Engineering, IEEE, 2009.
[38] J. Mao, Y. Cao, J. Yang, Implementation uncertainty evaluation of cylindricity errors based on geometrical product specification (GPS), Measurement (2009).
[39] X. Zhang, X. Jiang, P. Scott, A reliable method of minimum zone evaluation of cylindricity and conicity from coordinate measurement data, Precision Engineering 35 (3) (2011).
[40] K. Carr, P. Ferreira, Verification of form tolerances part II: cylindricity and straightness of a median line, Precision Engineering 17 (1995).
[41] X. Wen, A. Song, An improved genetic algorithm for planar and spatial straightness error evaluation, International Journal of Machine Tools and Manufacture 43 (11) (2003).


More information

Collaboratively Regularized Nearest Points for Set Based Recognition

Collaboratively Regularized Nearest Points for Set Based Recognition Academc Center for Computng and Meda Studes, Kyoto Unversty Collaboratvely Regularzed Nearest Ponts for Set Based Recognton Yang Wu, Mchhko Mnoh, Masayuk Mukunok Kyoto Unversty 9/1/013 BMVC 013 @ Brstol,

More information

Optimizing SVR using Local Best PSO for Software Effort Estimation

Optimizing SVR using Local Best PSO for Software Effort Estimation Journal of Informaton Technology and Computer Scence Volume 1, Number 1, 2016, pp. 28 37 Journal Homepage: www.jtecs.ub.ac.d Optmzng SVR usng Local Best PSO for Software Effort Estmaton Dnda Novtasar 1,

More information

CHAPTER 4 OPTIMIZATION TECHNIQUES

CHAPTER 4 OPTIMIZATION TECHNIQUES 48 CHAPTER 4 OPTIMIZATION TECHNIQUES 4.1 INTRODUCTION Unfortunately no sngle optmzaton algorthm exsts that can be appled effcently to all types of problems. The method chosen for any partcular case wll

More information

An Efficient Genetic Algorithm Based Approach for the Minimum Graph Bisection Problem

An Efficient Genetic Algorithm Based Approach for the Minimum Graph Bisection Problem 118 An Effcent Genetc Algorthm Based Approach for the Mnmum Graph Bsecton Problem Zh-Qang Chen, Rong-Long WAG and Kozo OKAZAKI Faculty of Engneerng, Unversty of Fuku, Bunkyo 3-9-1,Fuku-sh, Japan 910-8507

More information

THE PATH PLANNING ALGORITHM AND SIMULATION FOR MOBILE ROBOT

THE PATH PLANNING ALGORITHM AND SIMULATION FOR MOBILE ROBOT Journal of Theoretcal and Appled Informaton Technology 30 th Aprl 013. Vol. 50 No.3 005-013 JATIT & LLS. All rghts reserved. ISSN: 199-8645 www.jatt.org E-ISSN: 1817-3195 THE PATH PLANNING ALGORITHM AND

More information

A New Approach For the Ranking of Fuzzy Sets With Different Heights

A New Approach For the Ranking of Fuzzy Sets With Different Heights New pproach For the ankng of Fuzzy Sets Wth Dfferent Heghts Pushpnder Sngh School of Mathematcs Computer pplcatons Thapar Unversty, Patala-7 00 Inda pushpndersnl@gmalcom STCT ankng of fuzzy sets plays

More information

Maximum Variance Combined with Adaptive Genetic Algorithm for Infrared Image Segmentation

Maximum Variance Combined with Adaptive Genetic Algorithm for Infrared Image Segmentation Internatonal Conference on Logstcs Engneerng, Management and Computer Scence (LEMCS 5) Maxmum Varance Combned wth Adaptve Genetc Algorthm for Infrared Image Segmentaton Huxuan Fu College of Automaton Harbn

More information

A Load-balancing and Energy-aware Clustering Algorithm in Wireless Ad-hoc Networks

A Load-balancing and Energy-aware Clustering Algorithm in Wireless Ad-hoc Networks A Load-balancng and Energy-aware Clusterng Algorthm n Wreless Ad-hoc Networks Wang Jn, Shu Le, Jnsung Cho, Young-Koo Lee, Sungyoung Lee, Yonl Zhong Department of Computer Engneerng Kyung Hee Unversty,

More information

Natural Computing. Lecture 13: Particle swarm optimisation INFR /11/2010

Natural Computing. Lecture 13: Particle swarm optimisation INFR /11/2010 Natural Computng Lecture 13: Partcle swarm optmsaton Mchael Herrmann mherrman@nf.ed.ac.uk phone: 0131 6 517177 Informatcs Forum 1.42 INFR09038 5/11/2010 Swarm ntellgence Collectve ntellgence: A super-organsm

More information

A Fast Visual Tracking Algorithm Based on Circle Pixels Matching

A Fast Visual Tracking Algorithm Based on Circle Pixels Matching A Fast Vsual Trackng Algorthm Based on Crcle Pxels Matchng Zhqang Hou hou_zhq@sohu.com Chongzhao Han czhan@mal.xjtu.edu.cn Ln Zheng Abstract: A fast vsual trackng algorthm based on crcle pxels matchng

More information

A Novel Deluge Swarm Algorithm for Optimization Problems

A Novel Deluge Swarm Algorithm for Optimization Problems A Novel eluge Swarm Algorthm for Optmzaton Problems Anahta Samad,* - Mohammad Reza Meybod Scence and Research Branch, Islamc Azad Unversty, Qazvn, Iran Soft Computng Laboratory, Computer Engneerng and

More information

Support Vector Machines

Support Vector Machines Support Vector Machnes Decson surface s a hyperplane (lne n 2D) n feature space (smlar to the Perceptron) Arguably, the most mportant recent dscovery n machne learnng In a nutshell: map the data to a predetermned

More information

Chinese Word Segmentation based on the Improved Particle Swarm Optimization Neural Networks

Chinese Word Segmentation based on the Improved Particle Swarm Optimization Neural Networks Chnese Word Segmentaton based on the Improved Partcle Swarm Optmzaton Neural Networks Ja He Computatonal Intellgence Laboratory School of Computer Scence and Engneerng, UESTC Chengdu, Chna Department of

More information

Design of Structure Optimization with APDL

Design of Structure Optimization with APDL Desgn of Structure Optmzaton wth APDL Yanyun School of Cvl Engneerng and Archtecture, East Chna Jaotong Unversty Nanchang 330013 Chna Abstract In ths paper, the desgn process of structure optmzaton wth

More information

An Entropy-Based Approach to Integrated Information Needs Assessment

An Entropy-Based Approach to Integrated Information Needs Assessment Dstrbuton Statement A: Approved for publc release; dstrbuton s unlmted. An Entropy-Based Approach to ntegrated nformaton Needs Assessment June 8, 2004 Wllam J. Farrell Lockheed Martn Advanced Technology

More information

Comparison of Heuristics for Scheduling Independent Tasks on Heterogeneous Distributed Environments

Comparison of Heuristics for Scheduling Independent Tasks on Heterogeneous Distributed Environments Comparson of Heurstcs for Schedulng Independent Tasks on Heterogeneous Dstrbuted Envronments Hesam Izakan¹, Ath Abraham², Senor Member, IEEE, Václav Snášel³ ¹ Islamc Azad Unversty, Ramsar Branch, Ramsar,

More information

Optimizing Document Scoring for Query Retrieval

Optimizing Document Scoring for Query Retrieval Optmzng Document Scorng for Query Retreval Brent Ellwen baellwe@cs.stanford.edu Abstract The goal of ths project was to automate the process of tunng a document query engne. Specfcally, I used machne learnng

More information

Analysis of Particle Swarm Optimization and Genetic Algorithm based on Task Scheduling in Cloud Computing Environment

Analysis of Particle Swarm Optimization and Genetic Algorithm based on Task Scheduling in Cloud Computing Environment Analyss of Partcle Swarm Optmzaton and Genetc Algorthm based on Tas Schedulng n Cloud Computng Envronment Frederc Nzanywayngoma School of Computer and Communcaton Engneerng Unversty of Scence and Technology

More information

Using Particle Swarm Optimization for Enhancing the Hierarchical Cell Relay Routing Protocol

Using Particle Swarm Optimization for Enhancing the Hierarchical Cell Relay Routing Protocol 2012 Thrd Internatonal Conference on Networkng and Computng Usng Partcle Swarm Optmzaton for Enhancng the Herarchcal Cell Relay Routng Protocol Hung-Y Ch Department of Electrcal Engneerng Natonal Sun Yat-Sen

More information

Application of Improved Fish Swarm Algorithm in Cloud Computing Resource Scheduling

Application of Improved Fish Swarm Algorithm in Cloud Computing Resource Scheduling , pp.40-45 http://dx.do.org/10.14257/astl.2017.143.08 Applcaton of Improved Fsh Swarm Algorthm n Cloud Computng Resource Schedulng Yu Lu, Fangtao Lu School of Informaton Engneerng, Chongqng Vocatonal Insttute

More information

An Image Fusion Approach Based on Segmentation Region

An Image Fusion Approach Based on Segmentation Region Rong Wang, L-Qun Gao, Shu Yang, Yu-Hua Cha, and Yan-Chun Lu An Image Fuson Approach Based On Segmentaton Regon An Image Fuson Approach Based on Segmentaton Regon Rong Wang, L-Qun Gao, Shu Yang 3, Yu-Hua

More information

An Application of the Dulmage-Mendelsohn Decomposition to Sparse Null Space Bases of Full Row Rank Matrices

An Application of the Dulmage-Mendelsohn Decomposition to Sparse Null Space Bases of Full Row Rank Matrices Internatonal Mathematcal Forum, Vol 7, 2012, no 52, 2549-2554 An Applcaton of the Dulmage-Mendelsohn Decomposton to Sparse Null Space Bases of Full Row Rank Matrces Mostafa Khorramzadeh Department of Mathematcal

More information

An Iterative Solution Approach to Process Plant Layout using Mixed Integer Optimisation

An Iterative Solution Approach to Process Plant Layout using Mixed Integer Optimisation 17 th European Symposum on Computer Aded Process Engneerng ESCAPE17 V. Plesu and P.S. Agach (Edtors) 2007 Elsever B.V. All rghts reserved. 1 An Iteratve Soluton Approach to Process Plant Layout usng Mxed

More information

An Efficient Genetic Algorithm with Fuzzy c-means Clustering for Traveling Salesman Problem

An Efficient Genetic Algorithm with Fuzzy c-means Clustering for Traveling Salesman Problem An Effcent Genetc Algorthm wth Fuzzy c-means Clusterng for Travelng Salesman Problem Jong-Won Yoon and Sung-Bae Cho Dept. of Computer Scence Yonse Unversty Seoul, Korea jwyoon@sclab.yonse.ac.r, sbcho@cs.yonse.ac.r

More information

Wishing you all a Total Quality New Year!

Wishing you all a Total Quality New Year! Total Qualty Management and Sx Sgma Post Graduate Program 214-15 Sesson 4 Vnay Kumar Kalakband Assstant Professor Operatons & Systems Area 1 Wshng you all a Total Qualty New Year! Hope you acheve Sx sgma

More information

An Adaptive Multi-population Artificial Bee Colony Algorithm for Dynamic Optimisation Problems

An Adaptive Multi-population Artificial Bee Colony Algorithm for Dynamic Optimisation Problems *Revsed Manuscrpt (changes marked) Clck here to vew lnked References An Adaptve Mult-populaton Artfcal Bee Colony Algorthm for Dynamc Optmsaton Problems Shams K. Nseef 1, Salwan Abdullah 1, Ayad Turky

More information

Review of approximation techniques

Review of approximation techniques CHAPTER 2 Revew of appromaton technques 2. Introducton Optmzaton problems n engneerng desgn are characterzed by the followng assocated features: the objectve functon and constrants are mplct functons evaluated

More information

Parallel matrix-vector multiplication

Parallel matrix-vector multiplication Appendx A Parallel matrx-vector multplcaton The reduced transton matrx of the three-dmensonal cage model for gel electrophoress, descrbed n secton 3.2, becomes excessvely large for polymer lengths more

More information

Machine Learning: Algorithms and Applications

Machine Learning: Algorithms and Applications 14/05/1 Machne Learnng: Algorthms and Applcatons Florano Zn Free Unversty of Bozen-Bolzano Faculty of Computer Scence Academc Year 011-01 Lecture 10: 14 May 01 Unsupervsed Learnng cont Sldes courtesy of

More information

An Accurate Evaluation of Integrals in Convex and Non convex Polygonal Domain by Twelve Node Quadrilateral Finite Element Method

An Accurate Evaluation of Integrals in Convex and Non convex Polygonal Domain by Twelve Node Quadrilateral Finite Element Method Internatonal Journal of Computatonal and Appled Mathematcs. ISSN 89-4966 Volume, Number (07), pp. 33-4 Research Inda Publcatons http://www.rpublcaton.com An Accurate Evaluaton of Integrals n Convex and

More information

Research of Neural Network Classifier Based on FCM and PSO for Breast Cancer Classification

Research of Neural Network Classifier Based on FCM and PSO for Breast Cancer Classification Research of Neural Network Classfer Based on FCM and PSO for Breast Cancer Classfcaton Le Zhang 1, Ln Wang 1, Xujewen Wang 2, Keke Lu 2, and Ajth Abraham 3 1 Shandong Provncal Key Laboratory of Network

More information

Type-2 Fuzzy Non-uniform Rational B-spline Model with Type-2 Fuzzy Data

Type-2 Fuzzy Non-uniform Rational B-spline Model with Type-2 Fuzzy Data Malaysan Journal of Mathematcal Scences 11(S) Aprl : 35 46 (2017) Specal Issue: The 2nd Internatonal Conference and Workshop on Mathematcal Analyss (ICWOMA 2016) MALAYSIAN JOURNAL OF MATHEMATICAL SCIENCES

More information

BioTechnology. An Indian Journal FULL PAPER. Trade Science Inc.

BioTechnology. An Indian Journal FULL PAPER. Trade Science Inc. [Type text] [Type text] [Type text] ISSN : 0974-74 Volume 0 Issue BoTechnology 04 An Indan Journal FULL PAPER BTAIJ 0() 04 [684-689] Revew on Chna s sports ndustry fnancng market based on market -orented

More information

Outline. Type of Machine Learning. Examples of Application. Unsupervised Learning

Outline. Type of Machine Learning. Examples of Application. Unsupervised Learning Outlne Artfcal Intellgence and ts applcatons Lecture 8 Unsupervsed Learnng Professor Danel Yeung danyeung@eee.org Dr. Patrck Chan patrckchan@eee.org South Chna Unversty of Technology, Chna Introducton

More information

Training ANFIS Structure with Modified PSO Algorithm

Training ANFIS Structure with Modified PSO Algorithm Proceedngs of the 5th Medterranean Conference on Control & Automaton, July 7-9, 007, Athens - Greece T4-003 Tranng ANFIS Structure wth Modfed PSO Algorthm V.Seyd Ghomsheh *, M. Alyar Shoorehdel **, M.

More information

Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization

Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization Problem efntons and Evaluaton Crtera for the CEC 2005 Specal Sesson on Real-Parameter Optmzaton P. N. Suganthan, N. Hansen 2, J. J. Lang, K. eb 3, Y. -P. Chen 4, A. Auger 2, S. Twar 3 School of EEE, Nanyang

More information

MULTIOBJECTIVE OPTIMIZATION USING PARALLEL VECTOR EVALUATED PARTICLE SWARM OPTIMIZATION

MULTIOBJECTIVE OPTIMIZATION USING PARALLEL VECTOR EVALUATED PARTICLE SWARM OPTIMIZATION MULTIOBJECTIVE OPTIMIZATION USING PARALLEL VECTOR EVALUATED PARTICLE OPTIMIZATION K.E. Parsopoulos, D.K. Tasouls, M.N. Vrahats Department of Mathematcs, Unversty of Patras Artfcal Intellgence Research

More information

A Simple and Efficient Goal Programming Model for Computing of Fuzzy Linear Regression Parameters with Considering Outliers

A Simple and Efficient Goal Programming Model for Computing of Fuzzy Linear Regression Parameters with Considering Outliers 62626262621 Journal of Uncertan Systems Vol.5, No.1, pp.62-71, 211 Onlne at: www.us.org.u A Smple and Effcent Goal Programmng Model for Computng of Fuzzy Lnear Regresson Parameters wth Consderng Outlers

More information

A Time-driven Data Placement Strategy for a Scientific Workflow Combining Edge Computing and Cloud Computing

A Time-driven Data Placement Strategy for a Scientific Workflow Combining Edge Computing and Cloud Computing > REPLACE THIS LINE WITH YOUR PAPER IDENTIFICATION NUMBER (DOUBLE-CLICK HERE TO EDIT) < 1 A Tme-drven Data Placement Strategy for a Scentfc Workflow Combnng Edge Computng and Cloud Computng Bng Ln, Fangnng

More information

Real-time Motion Capture System Using One Video Camera Based on Color and Edge Distribution

Real-time Motion Capture System Using One Video Camera Based on Color and Edge Distribution Real-tme Moton Capture System Usng One Vdeo Camera Based on Color and Edge Dstrbuton YOSHIAKI AKAZAWA, YOSHIHIRO OKADA, AND KOICHI NIIJIMA Graduate School of Informaton Scence and Electrcal Engneerng,

More information

Solving two-person zero-sum game by Matlab

Solving two-person zero-sum game by Matlab Appled Mechancs and Materals Onlne: 2011-02-02 ISSN: 1662-7482, Vols. 50-51, pp 262-265 do:10.4028/www.scentfc.net/amm.50-51.262 2011 Trans Tech Publcatons, Swtzerland Solvng two-person zero-sum game by

More information

Available online at Available online at Advanced in Control Engineering and Information Science

Available online at   Available online at   Advanced in Control Engineering and Information Science Avalable onlne at wwwscencedrectcom Avalable onlne at wwwscencedrectcom Proceda Proceda Engneerng Engneerng 00 (2011) 15000 000 (2011) 1642 1646 Proceda Engneerng wwwelsevercom/locate/proceda Advanced

More information

A hybrid sequential approach for data clustering using K-Means and particle swarm optimization algorithm

A hybrid sequential approach for data clustering using K-Means and particle swarm optimization algorithm MultCraft Internatonal Journal of Engneerng, Scence and Technology Vol., No. 6, 00, pp. 67-76 INTERNATIONAL JOURNAL OF ENGINEERING, SCIENCE AND TECHNOLOGY www.jest-ng.com 00 MultCraft Lmted. All rghts

More information

Cracking of the Merkle Hellman Cryptosystem Using Genetic Algorithm

Cracking of the Merkle Hellman Cryptosystem Using Genetic Algorithm Crackng of the Merkle Hellman Cryptosystem Usng Genetc Algorthm Zurab Kochladze 1 * & Lal Besela 2 1 Ivane Javakhshvl Tbls State Unversty, 1, I.Chavchavadze av 1, 0128, Tbls, Georga 2 Sokhum State Unversty,

More information

Range images. Range image registration. Examples of sampling patterns. Range images and range surfaces

Range images. Range image registration. Examples of sampling patterns. Range images and range surfaces Range mages For many structured lght scanners, the range data forms a hghly regular pattern known as a range mage. he samplng pattern s determned by the specfc scanner. Range mage regstraton 1 Examples

More information

Whale swarm algorithm for function optimization

Whale swarm algorithm for function optimization Whale swarm algorthm for functon optmzaton Bng Zeng School of Mechancal Scence and Engneerng Huazhong Unversty of Scence and Technology Wuhan, Chna zengbng6@6.com Lang Gao* School of Mechancal Scence and

More information

Edge Detection in Noisy Images Using the Support Vector Machines

Edge Detection in Noisy Images Using the Support Vector Machines Edge Detecton n Nosy Images Usng the Support Vector Machnes Hlaro Gómez-Moreno, Saturnno Maldonado-Bascón, Francsco López-Ferreras Sgnal Theory and Communcatons Department. Unversty of Alcalá Crta. Madrd-Barcelona

More information

Tsinghua University at TAC 2009: Summarizing Multi-documents by Information Distance

Tsinghua University at TAC 2009: Summarizing Multi-documents by Information Distance Tsnghua Unversty at TAC 2009: Summarzng Mult-documents by Informaton Dstance Chong Long, Mnle Huang, Xaoyan Zhu State Key Laboratory of Intellgent Technology and Systems, Tsnghua Natonal Laboratory for

More information

Imperialist Competitive Algorithm with Variable Parameters to Determine the Global Minimum of Functions with Several Arguments

Imperialist Competitive Algorithm with Variable Parameters to Determine the Global Minimum of Functions with Several Arguments Fourth Internatonal Conference Modellng and Development of Intellgent Systems October 8 - November, 05 Lucan Blaga Unversty Sbu - Romana Imperalst Compettve Algorthm wth Varable Parameters to Determne

More information

The Codesign Challenge

The Codesign Challenge ECE 4530 Codesgn Challenge Fall 2007 Hardware/Software Codesgn The Codesgn Challenge Objectves In the codesgn challenge, your task s to accelerate a gven software reference mplementaton as fast as possble.

More information

An Influence of the Noise on the Imaging Algorithm in the Electrical Impedance Tomography *

An Influence of the Noise on the Imaging Algorithm in the Electrical Impedance Tomography * Open Journal of Bophyscs, 3, 3, 7- http://dx.do.org/.436/ojbphy.3.347 Publshed Onlne October 3 (http://www.scrp.org/journal/ojbphy) An Influence of the Nose on the Imagng Algorthm n the Electrcal Impedance

More information

Data Mining For Multi-Criteria Energy Predictions

Data Mining For Multi-Criteria Energy Predictions Data Mnng For Mult-Crtera Energy Predctons Kashf Gll and Denns Moon Abstract We present a data mnng technque for mult-crtera predctons of wnd energy. A mult-crtera (MC) evolutonary computng method has

More information

K-means Optimization Clustering Algorithm Based on Hybrid PSO/GA Optimization and CS validity index

K-means Optimization Clustering Algorithm Based on Hybrid PSO/GA Optimization and CS validity index Orgnal Artcle Prnt ISSN: 3-6379 Onlne ISSN: 3-595X DOI: 0.7354/jss/07/33 K-means Optmzaton Clusterng Algorthm Based on Hybrd PSO/GA Optmzaton and CS valdty ndex K Jahanbn *, F Rahmanan, H Rezae 3, Y Farhang

More information

Torusity Tolerance Verification using Swarm Intelligence

Torusity Tolerance Verification using Swarm Intelligence IEMS Vol. 6, No., pp. 94-15, December 7. Torusty Tolerance Verfcaton usng Swarm Intellgence Chakguy Prakasvudhsarn Industral Engneerng Program, Srndhorn Internatonal Insttute of Technology Thammasat Unversty,

More information

Quality Improvement Algorithm for Tetrahedral Mesh Based on Optimal Delaunay Triangulation

Quality Improvement Algorithm for Tetrahedral Mesh Based on Optimal Delaunay Triangulation Intellgent Informaton Management, 013, 5, 191-195 Publshed Onlne November 013 (http://www.scrp.org/journal/m) http://dx.do.org/10.36/m.013.5601 Qualty Improvement Algorthm for Tetrahedral Mesh Based on

More information

Straight Line Detection Based on Particle Swarm Optimization

Straight Line Detection Based on Particle Swarm Optimization Sensors & ransducers 013 b IFSA http://www.sensorsportal.com Straght Lne Detecton Based on Partcle Swarm Optmzaton Shengzhou XU, Jun IE College of computer scence, South-Central Unverst for Natonaltes,

More information

K-means and Hierarchical Clustering

K-means and Hierarchical Clustering Note to other teachers and users of these sldes. Andrew would be delghted f you found ths source materal useful n gvng your own lectures. Feel free to use these sldes verbatm, or to modfy them to ft your

More information

Optimization of integrated circuits by means of simulated annealing. Jernej Olenšek, Janez Puhan, Árpád Bűrmen, Sašo Tomažič, Tadej Tuma

Optimization of integrated circuits by means of simulated annealing. Jernej Olenšek, Janez Puhan, Árpád Bűrmen, Sašo Tomažič, Tadej Tuma Optmzaton of ntegrated crcuts by means of smulated annealng Jernej Olenšek, Janez Puhan, Árpád Bűrmen, Sašo Tomažč, Tadej Tuma Unversty of Ljubljana, Faculty of Electrcal Engneerng, Tržaška 25, Ljubljana,

More information

Biostatistics 615/815

Biostatistics 615/815 The E-M Algorthm Bostatstcs 615/815 Lecture 17 Last Lecture: The Smplex Method General method for optmzaton Makes few assumptons about functon Crawls towards mnmum Some recommendatons Multple startng ponts

More information

X- Chart Using ANOM Approach

X- Chart Using ANOM Approach ISSN 1684-8403 Journal of Statstcs Volume 17, 010, pp. 3-3 Abstract X- Chart Usng ANOM Approach Gullapall Chakravarth 1 and Chaluvad Venkateswara Rao Control lmts for ndvdual measurements (X) chart are

More information

Machine Learning 9. week

Machine Learning 9. week Machne Learnng 9. week Mappng Concept Radal Bass Functons (RBF) RBF Networks 1 Mappng It s probably the best scenaro for the classfcaton of two dataset s to separate them lnearly. As you see n the below

More information

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique //00 :0 AM Outlne and Readng The Greedy Method The Greedy Method Technque (secton.) Fractonal Knapsack Problem (secton..) Task Schedulng (secton..) Mnmum Spannng Trees (secton.) Change Money Problem Greedy

More information

User Authentication Based On Behavioral Mouse Dynamics Biometrics

User Authentication Based On Behavioral Mouse Dynamics Biometrics User Authentcaton Based On Behavoral Mouse Dynamcs Bometrcs Chee-Hyung Yoon Danel Donghyun Km Department of Computer Scence Department of Computer Scence Stanford Unversty Stanford Unversty Stanford, CA

More information