ARTICLE IN PRESS. Applied Soft Computing xxx (2012) xxx–xxx. Contents lists available at SciVerse ScienceDirect. Applied Soft Computing

Similar documents
Problem Definitions and Evaluation Criteria for Computational Expensive Optimization

Cluster Analysis of Electrical Behavior

EVALUATION OF THE PERFORMANCES OF ARTIFICIAL BEE COLONY AND INVASIVE WEED OPTIMIZATION ALGORITHMS ON THE MODIFIED BENCHMARK FUNCTIONS

Clustering Algorithm Combining CPSO with K-Means Chunqin Gu 1, a, Qian Tao 2, b

Complexity Analysis of Problem-Dimension Using PSO

Parallelism for Nested Loops with Non-uniform and Flow Dependences

Meta-heuristics for Multidimensional Knapsack Problems

Determining the Optimal Bandwidth Based on Multi-criterion Fusion

Learning the Kernel Parameters in Kernel Minimum Distance Classifier

CHAPTER 2 PROPOSED IMPROVED PARTICLE SWARM OPTIMIZATION

Optimal Design of Nonlinear Fuzzy Model by Means of Independent Fuzzy Scatter Partition

NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS

Content Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers

Classifier Swarms for Human Detection in Infrared Imagery

A Binarization Algorithm specialized on Document Images and Photos

Support Vector Machines

Network Intrusion Detection Based on PSO-SVM

Improvement of Spatial Resolution Using BlockMatching Based Motion Estimation and Frame. Integration

An Optimal Algorithm for Prufer Codes *

Subspace clustering. Clustering. Fundamental to all clustering techniques is the choice of distance measure between data points;

Recommended Items Rating Prediction based on RBF Neural Network Optimized by PSO Algorithm

Classifier Selection Based on Data Complexity Measures *

Multi-objective Optimization Using Self-adaptive Differential Evolution Algorithm

A Clustering Algorithm Solution to the Collaborative Filtering

Journal of Chemical and Pharmaceutical Research, 2014, 6(6): Research Article. A selective ensemble classification method on microarray data

A mathematical programming approach to the analysis, design and scheduling of offshore oilfields

Performance Evaluation of Information Retrieval Systems

Smoothing Spline ANOVA for variable screening

SLAM Summer School 2006 Practical 2: SLAM using Monocular Vision

The Research of Support Vector Machine in Agricultural Data Classification

Problem Definitions and Evaluation Criteria for the CEC 2015 Competition on Learning-based Real-Parameter Single Objective Optimization

Virtual Machine Migration based on Trust Measurement of Computer Node

Load Balancing for Hex-Cell Interconnection Network

A Notable Swarm Approach to Evolve Neural Network for Classification in Data Mining

An Improved Particle Swarm Optimization for Feature Selection

NGPM -- A NSGA-II Program in Matlab

A MOVING MESH APPROACH FOR SIMULATION BUDGET ALLOCATION ON CONTINUOUS DOMAINS

Collaboratively Regularized Nearest Points for Set Based Recognition

Optimizing SVR using Local Best PSO for Software Effort Estimation

CHAPTER 4 OPTIMIZATION TECHNIQUES

An Efficient Genetic Algorithm Based Approach for the Minimum Graph Bisection Problem

THE PATH PLANNING ALGORITHM AND SIMULATION FOR MOBILE ROBOT

A New Approach For the Ranking of Fuzzy Sets With Different Heights

Maximum Variance Combined with Adaptive Genetic Algorithm for Infrared Image Segmentation

A Load-balancing and Energy-aware Clustering Algorithm in Wireless Ad-hoc Networks

Natural Computing. Lecture 13: Particle swarm optimisation INFR /11/2010

A Fast Visual Tracking Algorithm Based on Circle Pixels Matching

A Novel Deluge Swarm Algorithm for Optimization Problems

Support Vector Machines

Chinese Word Segmentation based on the Improved Particle Swarm Optimization Neural Networks

Design of Structure Optimization with APDL

An Entropy-Based Approach to Integrated Information Needs Assessment

Comparison of Heuristics for Scheduling Independent Tasks on Heterogeneous Distributed Environments

Optimizing Document Scoring for Query Retrieval

Analysis of Particle Swarm Optimization and Genetic Algorithm based on Task Scheduling in Cloud Computing Environment

Using Particle Swarm Optimization for Enhancing the Hierarchical Cell Relay Routing Protocol

Application of Improved Fish Swarm Algorithm in Cloud Computing Resource Scheduling

An Image Fusion Approach Based on Segmentation Region

An Application of the Dulmage-Mendelsohn Decomposition to Sparse Null Space Bases of Full Row Rank Matrices

An Iterative Solution Approach to Process Plant Layout using Mixed Integer Optimisation

An Efficient Genetic Algorithm with Fuzzy c-means Clustering for Traveling Salesman Problem

Wishing you all a Total Quality New Year!

An Adaptive Multi-population Artificial Bee Colony Algorithm for Dynamic Optimisation Problems

Review of approximation techniques

Parallel matrix-vector multiplication

Machine Learning: Algorithms and Applications

An Accurate Evaluation of Integrals in Convex and Non convex Polygonal Domain by Twelve Node Quadrilateral Finite Element Method

Research of Neural Network Classifier Based on FCM and PSO for Breast Cancer Classification

Type-2 Fuzzy Non-uniform Rational B-spline Model with Type-2 Fuzzy Data

BioTechnology. An Indian Journal FULL PAPER. Trade Science Inc.

Outline. Type of Machine Learning. Examples of Application. Unsupervised Learning

Training ANFIS Structure with Modified PSO Algorithm

Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization

MULTIOBJECTIVE OPTIMIZATION USING PARALLEL VECTOR EVALUATED PARTICLE SWARM OPTIMIZATION

A Simple and Efficient Goal Programming Model for Computing of Fuzzy Linear Regression Parameters with Considering Outliers

A Time-driven Data Placement Strategy for a Scientific Workflow Combining Edge Computing and Cloud Computing

Real-time Motion Capture System Using One Video Camera Based on Color and Edge Distribution

Solving two-person zero-sum game by Matlab

Available online at Available online at Advanced in Control Engineering and Information Science

A hybrid sequential approach for data clustering using K-Means and particle swarm optimization algorithm

Cracking of the Merkle Hellman Cryptosystem Using Genetic Algorithm

Range images. Range image registration. Examples of sampling patterns. Range images and range surfaces

Whale swarm algorithm for function optimization

Edge Detection in Noisy Images Using the Support Vector Machines

Tsinghua University at TAC 2009: Summarizing Multi-documents by Information Distance

Imperialist Competitive Algorithm with Variable Parameters to Determine the Global Minimum of Functions with Several Arguments

The Codesign Challenge

An Influence of the Noise on the Imaging Algorithm in the Electrical Impedance Tomography *

Data Mining For Multi-Criteria Energy Predictions

K-means Optimization Clustering Algorithm Based on Hybrid PSO/GA Optimization and CS validity index

Torusity Tolerance Verification using Swarm Intelligence

Quality Improvement Algorithm for Tetrahedral Mesh Based on Optimal Delaunay Triangulation

Straight Line Detection Based on Particle Swarm Optimization

K-means and Hierarchical Clustering

Optimization of integrated circuits by means of simulated annealing. Jernej Olenšek, Janez Puhan, Árpád Bűrmen, Sašo Tomažič, Tadej Tuma

Biostatistics 615/815

X- Chart Using ANOM Approach

Machine Learning 9. week

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique

User Authentication Based On Behavioral Mouse Dynamics Biometrics

Transcription:

ASOC-11; No. of Pages 1. Applied Soft Computing xxx (2012) xxx–xxx. Contents lists available at SciVerse ScienceDirect. Applied Soft Computing. journal homepage: www.elsevier.com/locate/asoc

A hierarchical particle swarm optimizer with latin sampling based memetic algorithm for numerical optimization

Yong Peng a,∗, Bao-Liang Lu a,b

a Center for Brain-like Computing and Machine Intelligence, Department of Computer Science and Engineering, Shanghai Jiao Tong University, Shanghai, PR China
b MoE-Microsoft Key Laboratory for Intelligent Computing and Intelligent Systems, Shanghai Jiao Tong University, Shanghai, PR China

Article history: Received 31 December 2011. Received in revised form 13 May 2012. Accepted 21 May 2012. Available online xxx.

Keywords: Memetic algorithm; Particle swarm optimizer; Latin hypercube sampling; Comprehensive learning; Cylindricity

Abstract: Memetic algorithms, one type of algorithms inspired by nature, have been successfully applied to solve numerous optimization problems in diverse fields. In this paper, we propose a new memetic computing model, using a hierarchical particle swarm optimizer (HPSO) and the latin hypercube sampling (LHS) method. In the bottom layer of the hierarchical PSO, several swarms evolve in parallel to avoid being trapped in local optima. The learning strategy for each swarm is the well-known comprehensive learning method with a newly designed mutation operator. After the evolution process accomplished in the bottom layer, one particle from each swarm is selected as a candidate to construct the swarm in the top layer, which evolves by the same strategy employed in the bottom layer. The local search strategy based on LHS is imposed on particles in the top layer every specified number of generations. The new memetic computing model is extensively evaluated on a suite of 16 numerical optimization functions as well as the cylindricity error evaluation problem. Experimental results show that the proposed algorithm compares favorably with conventional PSO and several variants. © 2012 Elsevier B.V. All rights reserved.

1.
Introduction

Optimization has been a research hotspot for several decades. Many real-world optimization problems in engineering are becoming increasingly complicated, so optimization algorithms with high performance are needed [1,2]. Unconstrained optimization problems can be formulated as D-dimensional optimization problems over continuous space:

min f(x), x = [x_1, x_2, ..., x_D]    (1)

Evolutionary algorithms, inspired by natural evolution, have been widely used as effective tools to solve optimization problems. One class of nature-inspired algorithms is swarm intelligence algorithms. The particle swarm optimizer (PSO) [3,4] has attracted attention in the academic and industrial communities. Although PSO shares many similarities with evolutionary algorithms, the original PSO does not use the traditional evolution operators such as crossover and mutation. PSO draws on the swarm behavior of birds flocking, where they search for food in a collaborative way. Each member of the swarm, called a particle, represents a potential solution to the target problem, and it adapts its search patterns by learning from its own experience and the experience of other members. The particle is a point in the search space and it aims at finding the global optimum, which is regarded as the location of food. Each particle has two attributes, position and velocity, and its direction of flight is adjusted according to the experiences of the swarm. The swarm as a whole searches for the global optimum in the D-dimensional feasible space. The PSO algorithm is easy to understand and implement, and has been proved to perform well on many optimization problems. However, it may easily get trapped in a local optimum for many reasons, such as the lack of diversity among particles and overlearning from the best particle found so far. To improve PSO's performance on complex numerical optimization problems, we propose a hierarchical PSO framework, in which several swarms evolve in parallel towards the global optimum, and we design a new mutation operator to increase the diversity of the swarms.

∗ Corresponding author. E-mail address: StanY.Peng@gmail.com (Y. Peng).
After evolving for a specified number of generations, a latin hypercube sampling method is used to execute the local search.

This paper is organized as follows. Section 2 introduces the original PSO and some variants. Section 3 describes the proposed hierarchical PSO with latin sampling based memetic algorithm, including four subsections: the hierarchical PSO framework, the mutation strategy, the latin hypercube sampling based local search strategy, and the overall framework of the proposed memetic algorithm. Section 4 gives the experimental results, describes the related parameter tuning process and compares the performance of the proposed algorithm on a suite of test problems to that of other PSO variants. Section 5 gives conclusions and describes future work.

15-9/$ – see front matter © 2012 Elsevier B.V. All rights reserved. http://dx.doi.org/1.11/j.asoc.1.5.

Please cite this article in press as: Y. Peng, B.-L. Lu, A hierarchical particle swarm optimizer with latin sampling based memetic algorithm for numerical optimization, Appl. Soft Comput. J. (2012), http://dx.doi.org/1.11/j.asoc.1.5.

Fig. 1. Original particle swarm optimizer.

2. Particle swarm optimizers

2.1. Original PSO

PSO is a stochastic optimization algorithm which simulates swarm behavior. The individuals move over a specified D-dimensional feasible space. As in a genetic algorithm, the particles in PSO are initialized with random velocities and positions. The algorithm adaptively updates the velocity and position of each particle in the swarm by learning from the good experiences. In the original PSO [3], the velocity V_i^d and position X_i^d of the dth dimension of the ith particle are updated as follows:

V_i^d := V_i^d + c_1 · rand1_i^d · (pbest_i^d − X_i^d) + c_2 · rand2_i^d · (gbest^d − X_i^d)
X_i^d := X_i^d + V_i^d    (2)

where X_i = (X_i^1, X_i^2, ..., X_i^D) is the position of the ith particle and V_i = (V_i^1, V_i^2, ..., V_i^D) represents the velocity of particle i; pbest_i = (pbest_i^1, pbest_i^2, ..., pbest_i^D) is the best previous position yielding the best fitness value for the ith particle; gbest = (gbest^1, gbest^2, ..., gbest^D) is the best position found so far over the whole swarm; c_1 and c_2 are the acceleration constants, reflecting the weighting of the stochastic acceleration terms that pull each particle towards the pbest and gbest positions, respectively; rand1_i^d and rand2_i^d are two random numbers in the range [0, 1]. A particle's velocity on each dimension is confined to a maximum magnitude V_max. If |V_i^d| exceeds a pre-specified positive constant value V_max^d, then the velocity on that dimension is assigned to sign(V_i^d)·V_max^d. The framework of the original PSO is shown in Fig. 1. From the flow of the iterative process, we can find that each particle flies towards the global best particle in the swarm; this leads to a severe drawback of overlearning from the best particle. Consequently, the diversity of the whole swarm will drop dramatically. If the best particle does not share the same niche with the global optimum, the particles may easily get trapped in a local optimum.
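Update rule (2) with the V_max clamp can be sketched as follows; this is a minimal illustration, and the function name `pso_step` and the default values c1 = c2 = 2.0 are our own choices, not taken from the paper:

```python
import random

def pso_step(X, V, pbest, gbest, c1=2.0, c2=2.0, vmax=None):
    """One iteration of the original PSO update of Eq. (2).

    X, V, pbest are lists of per-particle coordinate lists (swarm x dims);
    gbest is the best position found so far by the whole swarm."""
    D = len(gbest)
    for i in range(len(X)):
        for d in range(D):
            r1, r2 = random.random(), random.random()  # rand1, rand2 in [0, 1]
            V[i][d] += c1 * r1 * (pbest[i][d] - X[i][d]) \
                     + c2 * r2 * (gbest[d] - X[i][d])
            # confine the velocity magnitude to Vmax (sign-preserving clamp)
            if vmax is not None and abs(V[i][d]) > vmax[d]:
                V[i][d] = (1 if V[i][d] > 0 else -1) * vmax[d]
            X[i][d] += V[i][d]
    return X, V
```

In a full optimizer this step would be followed by fitness evaluation and the pbest/gbest updates shown in Fig. 1.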
Since PSO's introduction in 1995, many researchers have worked on improving its performance in various ways, and many more effective variants have been proposed; these will be discussed in the next subsection.

2.2. Some variants of PSO

This section gives a brief survey of several PSO variants proposed in recent years. Shi and Eberhart [5] introduced an inertia weight w into the original PSO algorithm, so the criterion for updating the velocity was changed to

V_i^d := w·V_i^d + c_1 · rand1_i^d · (pbest_i^d − X_i^d) + c_2 · rand2_i^d · (gbest^d − X_i^d)    (3)

They indicated that the inertia weight plays an important role in balancing the global and local search abilities; a large inertia weight encourages global search while a small inertia weight encourages local search. Based on this idea, the inertia weight is usually set to decrease linearly over iterations. Different types of topologies have been designed to improve PSO's performance in solving different optimization problems. Kennedy [6,7] claimed that PSO with a small neighborhood might perform better on complex problems, while PSO with a large neighborhood would perform better on simple problems. Suganthan [8] defined the neighborhood of a particle as the several nearest particles in each iteration, so that a dynamic neighborhood is computationally intensive. Jan et al. [9] examined several

neighborhood topologies. The unified PSO (UPSO) proposed by Parsopoulos and Vrahatis [10] combined the global version and local version of the original PSO. Mendes et al. [11] used all the neighbors of the particle to update the velocity instead of the pbest and the gbest. The neighbors of each particle were selected based on its fitness value and the size of the neighborhood. Peram et al. [12] proposed the fitness-distance-ratio-based PSO (FDR-PSO). When updating each velocity dimension, the FDR-PSO algorithm selects one other particle, nbest, which has a higher fitness value and is nearer to the particle being updated. In comprehensive learning PSO (CLPSO) [13], the velocity of each dimension is influenced by the pbest of every other particle, which increases the diversity of the swarm for multimodal optimization problems. In [14], several subswarms were used to coevolve with each other. The entire population was shuffled at periodic stages and subswarms were reassigned. Yang and Li [15] developed a hierarchical clustering method to partition the original swarm into several subswarms, which locate and track multiple optima in dynamic environments. Wang et al. [16] proposed a memetic algorithm based on a particle swarm optimizer with a ring-shaped topology; later, he improved his algorithm by partitioning particles in the ring-shaped topology structure into several species which can update information in parallel [17]. Chen [18] proposed a two-layer PSO for unconstrained optimization problems, where each subswarm was made to evolve based on the original PSO. Although the original PSO does not use the traditional evolution operators such as crossover and mutation, researchers have introduced some other search techniques, including evolutionary operators, into PSO to improve its performance. Evolutionary operators such as crossover, mutation and selection were used in [19–21]. In Ref.
[22], deflection, stretching and repulsion techniques are used to find as many minima as possible by preventing particles from moving to a previously discovered minimal region. Cooperative PSO (CPSO-H) [23] uses one-dimensional swarms to search each dimension separately. In recent years, many advanced operators have been introduced to improve PSO's performance. Ling et al. [24] employed a wavelet-theory-based mutation operation to enhance PSO in exploring the solution space more effectively. Zhao [25] proposed a perturbed PSO (pPSO) algorithm which introduced a perturbed global best to deal with the problem of premature convergence and diversity maintenance within the swarm. Gao et al. [26] incorporated a Henon map based mutation operator, which divided the mutation operator into global and local mutation operators; this enabled the particles to have a stronger exploration ability and a fast convergence rate. Although many variants of PSO have been proposed, all of which enhance the performance of the original PSO to some extent, the effectiveness of these variants in dealing with diverse problems with different characteristics is still unsatisfying. For example, CLPSO's performance on ill-conditioned problems is poor, and an algorithm [27] with high convergence speed is prone to shrink towards local optima. So taking measures including the model structure, the velocity updating strategy and hybrid operators simultaneously, according to the particles' behavior, may be a feasible path to a satisfactory result over diverse numerical optimization problems.

3. The proposed memetic algorithm

In this section, we introduce the proposed memetic algorithm in detail; it is based on a hierarchical PSO framework and some search techniques, including a local search strategy called the latin hypercube sampling method and a hybrid mutation strategy.

Fig. 2. The architecture of the two-layer hierarchical PSO (bottom layer: swarms 1, ..., M; top layer: one swarm built from gbest_1, gbest_2, ..., gbest_M).

3.1.
The hierarchical particle swarm optimizer

There are two versions of PSO, global and local, according to the approach of choosing gbest. In the global version, each particle can be influenced by the particle with the best fitness in the whole swarm, which causes all the particles to move and converge quickly on one optimum point in the search space. By contrast, the local version only allows a particle to be influenced by the best fitness particle from its neighborhood, which makes the algorithm exhibit a good exploration capacity because the population can slowly converge to the optimal space. Recently, many algorithms have been proposed to partition the population into several subswarms based on Euclidean distance [28], fitness value [29] and some other metrics [15,17]. These subswarms are different definitions of the neighborhood, and each particle can only interact with particles in its neighborhood to avoid converging too fast. Obviously, computing the Euclidean distance is time-consuming when the dimension is high; individuals with similar fitness values, which are prone to be classified into the same group, may be in different niches. And the species formation method [17] is complicated and partially depends on the distance between particles. Here, we propose a two-layer hierarchical PSO model. There are M swarms in the bottom layer with N particles in each swarm and only one swarm in the top layer. Fig. 2 gives the architecture of the hierarchical PSO. For each swarm in the bottom layer, particles move towards the optimum based on the comprehensive learning method [13] described below, which is a typical local version of PSO. After each iteration, the M swarms in the bottom layer will generate M best particles, which stand a chance to enter the top layer. So in the top layer, the number of particles is identical to the number of swarms in the bottom layer, and they are trained by comprehensive learning as well. The reasons for selecting the hierarchical PSO can be stated as follows.
First, several swarms evolving in parallel have a good chance to reach the global optimum even if some of them stagnate in local optima. Second, the swarms are generated randomly, which saves the time of computing the neighborhood based on Euclidean distance. Though simple, this approach might be effective. For this model, the movement of particles in the bottom layer is similar to a local search, and the movement of particles in the top layer is similar to a global search. The best particle in the top layer can influence particles in the bottom layer indirectly, so that the speed of convergence will slow down. So this model can work for both exploitation and exploration simultaneously. The comprehensive learning method [13], used to train particles in the hierarchical PSO model, is specifically designed for complex multimodal problems. Simply speaking, CLPSO designs a set of exemplars pbest_{f_i(d)}^d for each particle to update its velocity instead of the traditional pbest and gbest, which enlarges the search scope and enhances the performance of local search. Fig. 3 gives the flow of the comprehensive learning method.
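The two-layer flow described above can be sketched as follows. This is a simplified illustration under our own naming: `evolve` stands for any single-swarm update routine (CLPSO in the paper), and `cl_exemplar` follows the spirit of the comprehensive learning rule while omitting details such as CLPSO's refreshing-gap bookkeeping:

```python
import random

def cl_exemplar(pbests, fitness, i, Pc=0.3):
    """Pick comprehensive-learning exemplar indices f_i(d) for particle i:
    per dimension, with probability Pc learn from the fitter of two randomly
    chosen particles' pbests, otherwise from i's own pbest (simplified)."""
    f = []
    for _ in range(len(pbests[i])):
        if random.random() < Pc:
            a = random.randrange(len(pbests))
            b = random.randrange(len(pbests))
            f.append(a if fitness[a] < fitness[b] else b)  # minimization
        else:
            f.append(i)
    return f

def hierarchical_step(bottom_swarms, evolve):
    """One generation of the two-layer model: evolve each bottom-layer swarm
    in parallel, collect each swarm's best particle into the top swarm, then
    evolve the top swarm with the same strategy.  `evolve` is any routine
    returning (updated_swarm, best_particle)."""
    top_swarm = []
    for s, swarm in enumerate(bottom_swarms):
        bottom_swarms[s], best = evolve(swarm)
        top_swarm.append(best)
    top_swarm, top_best = evolve(top_swarm)
    return bottom_swarms, top_swarm, top_best
```

With M bottom swarms, the top swarm therefore always holds exactly M particles, matching Fig. 2.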

Fig. 3. Comprehensive learning PSO.

3.2. Mutation strategy

Most variants of PSO adopt strategies to update the old velocity vector based on the particles in the neighborhood, so they have difficulty in adapting quickly to the different optimization stages of ill-conditioned problems. In this subsection, we propose a new mutation operator, inspired by the mutation operation in Differential Evolution (DE) [30]. It updates the particles' positions based on the differential information and the pbest. The mutation operator can be formulated as

X_i^d := c·(X_k^d − X_j^d) + c·(pbest_i^d − X_i^d);  c ~ N(0.5, 0.2)    (4)

where X_k^d and X_j^d are the dth variables of two randomly selected other particles, and N(0.5, 0.2) represents the Gaussian distribution with mean 0.5 and standard deviation 0.2. We carry out the mutation operation, after updating the pbest and gbest in both the bottom layer and the top layer, with probability Pm, except for the best particle in each swarm. This operator will generate a disturbance when particles' positions are close to local optima.

3.3. Local search based on latin sampling

Latin hypercube sampling, which was proposed by McKay [31], is a stratified sampling approach. This paper employs this sampling method to exploit the excellent subspace which has been found so far. Suppose that V is a hypercube with dimension n, of which each dimension x_i is denoted as [x_i^l, x_i^u] (i = 1, 2, ..., n; x_i^l and x_i^u are the lower bound and the upper bound of dimension i, respectively); then the algorithm for generating H samples in this hypercube V is shown in Fig. 4. Here, a simple instance is provided to demonstrate the latin sampling process in detail. If the dimension of the hypercube is two and the sampling scale is eight, then a satisfactory sampling matrix A is formed, each row of A^T being a permutation of {0, 1, ..., 7}, for example

A = [ 4 3 7 5 1 0 2 6 ; 1 5 3 7 0 2 4 6 ]^T    (5)

Fig. 4. Latin sampling process in hypercube.
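Mutation rule (4) and the latin hypercube construction can be sketched together as follows. The function names are ours, and two details are our assumptions: c is redrawn per dimension (the paper does not make this explicit), and a uniform offset is used inside each sampling cell:

```python
import random

def mutate(X, i, pbest_i, Pm):
    """Differential-information mutation of Eq. (4), applied with
    probability Pm: X_i^d := c*(X_k^d - X_j^d) + c*(pbest_i^d - X_i^d),
    with c drawn from the Gaussian N(0.5, 0.2)."""
    if random.random() >= Pm:
        return X[i]
    k, j = random.sample([p for p in range(len(X)) if p != i], 2)
    new = []
    for d in range(len(X[i])):
        c = random.gauss(0.5, 0.2)
        new.append(c * (X[k][d] - X[j][d]) + c * (pbest_i[d] - X[i][d]))
    return new

def latin_hypercube(H, lower, upper):
    """Draw H latin hypercube samples in the box [lower, upper]:
    each axis is cut into H equal cells, and each cell index is used
    exactly once per axis (one random permutation per dimension,
    i.e. one column of the sampling matrix A)."""
    n = len(lower)
    perms = [random.sample(range(H), H) for _ in range(n)]
    samples = []
    for h in range(H):
        point = [lower[d] + (perms[d][h] + random.random()) *
                 (upper[d] - lower[d]) / H for d in range(n)]
        samples.append(point)
    return samples
```

By construction every row and column of cells receives exactly one sample, which is the space-filling property discussed next.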

Fig. 5. Latin hypercube sampling in 2-dimensional space.

and the corresponding samples in the hypercube are shown in Fig. 5. Latin hypercube sampling can be viewed as a space-filling design, which means that one and only one sample is selected in each row or column of each sub-hypercube. So the samples generated by latin sampling are distributed uniformly in the hypercube space, and this is helpful for maintaining the diversity of the population.

3.4. The proposed memetic algorithm

In this section, we introduce the hierarchical PSO with latin hypercube sampling based memetic algorithm (MA-HPSOL) as a whole. Fig. 6 gives the overall framework of the proposed algorithm. From Fig. 6, we know that hierarchical PSO is the main framework of the proposed memetic algorithm. Swarms in the framework are trained by the comprehensive learning method. The latin hypercube sampling based local search is performed every ten iterations. Furthermore, a differential information based mutation operator is employed to maintain the diversity of the swarms. To describe the proposed algorithm more explicitly, the complete flowchart of MA-HPSOL is given in Fig. 7. In the next section, a large number of test problems are used to evaluate the performance of the proposed algorithm. Suppose that the computation cost of one particle in the CLPSO approach is c, the cost of the mutation operator is c_m and the cost of the latin local search is c_l; then the total computation cost of MA-HPSOL for one generation is (M + 1)(c + c_m) + M·c_l. But when solving real-world problems, the fitness evaluation usually accounts for most of the time, as PSO itself is highly computationally efficient. So the algorithm-related computation times are not given in this paper.

4. Experimental study
The test problems are scalable to any number of varables, so we manly employ the test problems wth 1 and 3 varables. We wll compare MA-HPSOL wth PSO wth nerta weght [5], [1], FDR-PSO [1], [13] and [1]..1. Test functons In ths subsecton, we choose 1 functon optmzaton problems to demonstrate the effectveness of the proposed MA-HPSOL algorthm. They can be classfed nto four types: unmodal, multmodal, rotated and composte problems. Table 1 tabulates the benchmark test functons wth ther notable characterstcs. The detaled characterstcs of these test functons can be found n [3]... Senstvty n relaton to parameters For the proposed MA-HPSOL algorthm, there are four parameters: M (the number of swarms n the bottom layer), (the number Fg.. The proposed memetc algorthm (MA-HPSOL). for numercal optmzaton, Appl. Soft Comput. J. (1), http://dx.do.org/1.11/j.asoc.1.5.

Fig. 7. Flowchart of the proposed algorithm (MA-HPSOL): in each generation k the M bottom-layer swarms update V, pbest and their gbests; the M gbests form the top swarm; when mod(k, 10) == 0 the latin-sampling local search is applied; this repeats until the maximum generation Gen is reached.

of particles in each swarm), the sampling scale p and the length δ of each dimension of the hypercube.

Sensitivity in relation to M and N. We examine the results of MA-HPSOL in optimizing functions 1, , and 1, with the number of swarms M increased from 2 to 10 in steps of 1 and the number of particles N in each swarm increased from 2 to 10 in steps of 1. The values of the other parameters are as follows: the sampling scale is 10, the length δ of each dimension of the hypercube is twice the length of the corresponding dimension of the selected particle, and the mutation probability is the inverse of the dimensionality D (here only D = 10 is taken into consideration). The maximum number of function evaluations is set at 100,000. The data are statistical average values obtained from 30 independent runs. The results are shown in Fig. 8. From the experimental results of the several test functions depicted in Fig. 8, we can easily find that small values of M and N are encouraged by MA-HPSOL. Therefore, the number of swarms M and the number of particles N in each swarm are both set to 3 in all following experiments.

Sensitivity to the length δ of each dimension of the hypercube. How do we get a proper neighborhood for exploitation once a particle has been selected to execute the local search? Because each dimension of the particle may be different from the others, the length δ of each dimension of the hypercube should be adaptive to the selected particle. Here, we propose a simple method to specify δ which shows excellent performance in our experiments: the length δ of each dimension of the hypercube is twice the length of the corresponding dimension of the selected particle.

Sensitivity to sampling scale p.
It is difficult to choose a proper sampling scale p, because if we choose a bigger value, the generations for evolution will be few, and if we choose a smaller value, the neighborhood of a selected particle may not be exploited sufficiently (if the maximum number of function evaluations is fixed). So we should strike a balance between the sampling scale p and the generations of evolution. The experimental results of MA-HPSOL in optimizing two of the test functions with the sampling scale p increased from 5 to 30 in steps of 5 are shown in Fig. 9. Both the number of swarms M and the number of particles N are set to 3, and the other parameters are the same as mentioned above. From the experimental results depicted in Fig. 9, it is obvious that MA-HPSOL gets better results when the sampling scale is set to 5, 10 or 15. Therefore, the sampling scale p in all following experiments is set to 10. All the parameters for MA-HPSOL are shown in Table 2, where δ set to 2 means that the length of each dimension δ of the hypercube is two times the length of the corresponding dimension of the selected particle, and MAXFES stands for the maximum number of function evaluations (10,000 × dimension).
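One possible reading of the δ rule above can be sketched as follows; centring the hypercube on the selected particle and clipping it to the search range are our assumptions, since the paper does not spell these details out:

```python
def local_search_box(x, bounds):
    """Build the local-search hypercube around a selected particle x:
    in each dimension d the side length is delta = 2*|x_d|, i.e. the
    interval [x_d - |x_d|, x_d + |x_d|], clipped to the search bounds
    (list of (lo, hi) pairs).  Returns (lower, upper) bound lists."""
    lower, upper = [], []
    for d, xd in enumerate(x):
        half = abs(xd)  # half of delta = 2*|x_d|
        lo, hi = bounds[d]
        lower.append(max(lo, xd - half))
        upper.append(min(hi, xd + half))
    return lower, upper
```

The resulting box would then be fed to the latin sampler with H = p samples.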

Fig. 8. MA-HPSOL sensitivity in relation to parameters M and N: log10 fitness values of four test functions plotted against M and N.

Fig. 9. MA-HPSOL sensitivity in relation to sampling scale p: fitness values of two test functions for p between 5 and 35.

Table 1. Benchmark functions used in this study (function; range; characteristics; optimum).

f1(x) = Σ_{i=1}^{D} x_i²; [−100, 100]^D; unimodal; 0
f2(x) = Σ_{i=1}^{D−1} (100(x_{i+1} − x_i²)² + (x_i − 1)²); [−2.048, 2.048]^D; unimodal; 0
f3(x) = −20 exp(−0.2 √((1/D) Σ_{i=1}^{D} x_i²)) − exp((1/D) Σ_{i=1}^{D} cos(2πx_i)) + 20 + e; [−32.768, 32.768]^D; multimodal; 0
f4(x) = Σ_{i=1}^{D} x_i²/4000 − Π_{i=1}^{D} cos(x_i/√i) + 1; [−600, 600]^D; multimodal; 0
f5(x) = Σ_{i=1}^{D} Σ_{k=0}^{kmax} [a^k cos(2π b^k (x_i + 0.5))] − D Σ_{k=0}^{kmax} [a^k cos(2π b^k · 0.5)], a = 0.5, b = 3, kmax = 20; [−0.5, 0.5]^D; multimodal; 0
f6(x) = Σ_{i=1}^{D} (x_i² − 10 cos(2πx_i) + 10); [−5.12, 5.12]^D; multimodal; 0
f7(x) = Σ_{i=1}^{D} (y_i² − 10 cos(2πy_i) + 10), y_i = x_i if |x_i| < 0.5, y_i = round(2x_i)/2 if |x_i| ≥ 0.5; [−5.12, 5.12]^D; multimodal; 0
f8(x) = 418.9829·D − Σ_{i=1}^{D} x_i sin(|x_i|^{1/2}); [−500, 500]^D; multimodal; 0
f9(x) = f3(y), y = M·x; [−32.768, 32.768]^D; rotated; 0
f10(x) = f4(y), y = M·x; [−600, 600]^D; rotated; 0
f11(x) = f5(y), y = M·x; [−0.5, 0.5]^D; rotated; 0
f12(x) = f6(y), y = M·x; [−5.12, 5.12]^D; rotated; 0
f13(x) = f7(y), y = M·x; [−5.12, 5.12]^D; rotated; 0
f14(x) = 418.9829·D − Σ_{i=1}^{D} z_i, z_i = y_i sin(|y_i|^{1/2}) if |y_i| ≤ 500, z_i = −0.001(|y_i| − 500)² if |y_i| > 500; y = M·(x − 420.96) + 420.96; [−500, 500]^D; rotated; 0
f15 = CF1; [−5, 5]^D; composition; 0
f16 = CF2; [−5, 5]^D; composition; 0

Table 2. Parameter settings for MA-HPSOL.

Parameters | 10D | 30D
{M, N} | {3, 3} | {3, 3}
p | 10 | 10
δ | 2 | 2
MAXFES | 10,000 × 10 | 10,000 × 30

4.3. Experimental results of MA-HPSOL on test functions

In this section, we give the experimental results obtained by MA-HPSOL in optimizing the above-mentioned 16 functions with 10 and 30 variables. Based on the parameter sensitivity analysis, the parameters are set as shown in Table 2. Table 3 shows the statistical results of MA-HPSOL in optimizing the 16 functions with ten variables (10-D functions) based on 30 independent runs, including the maximum, minimum, mean and standard deviation. The termination criterion in this experiment is to run MA-HPSOL until the number of function evaluations reaches the maximum value 100,000.

Table 3. Statistical results of MA-HPSOL in optimizing 10-D functions (columns: Max, Min, Mean, Std).

f1: 0, 0, 0, 0
f2: 1.35e 3 3.7e .39e 3 .5e
f3–f7: 0, 0, 0, 0
f8: .559e 7 1.91e1 1.1e .519e
f9–f13: 0, 0, 0, 0
f14: .3173e .75e 1.77e 9 .395e 9
f15: .357e 9 .57e3 3.1e .379e
f16: 5.13e+ 1.75e+ 3.1e+ .59e+
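For readers who want to reproduce the setup, three of the Table 1 benchmarks are shown below in their standard forms (f1 sphere, f2 Rosenbrock, f6 Rastrigin); the function names are ours:

```python
import math

def sphere(x):
    """f1, unimodal; optimum 0 at the origin."""
    return sum(v * v for v in x)

def rosenbrock(x):
    """f2, ill-conditioned curved valley; optimum 0 at (1, ..., 1)."""
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1.0) ** 2
               for i in range(len(x) - 1))

def rastrigin(x):
    """f6, highly multimodal; optimum 0 at the origin."""
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)
```

These are evaluated inside the search ranges given in Table 1 ([−100, 100], [−2.048, 2.048] and [−5.12, 5.12] per dimension, respectively).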
Obviously, MA-HPSOL performs very well on most of the 16 functions. For functions 1, 3, 4, 5, 6, 7, 9, 10, 11, 12 and 13, the maximum, minimum and mean values of the 30 runs are all equal to the optimal values. The performance of MA-HPSOL is stable because the diversity is kept at a high level to avoid premature convergence. But when solving functions 2, 8, 14, 15 and 16, MA-HPSOL does not obtain exactly optimal results. Due to the ill-conditioned nature of function 2 (the Rosenbrock problem) and function 8 (the Schwefel problem), whose optima lie at (1, 1, ..., 1) and (420.96, 420.96, ..., 420.96) respectively, it is hard to adapt quickly to the different optimization stages. Also, function 14 is the rotated version of function 8, so there is some distance between the local optimum found (1.77e-9) and the global optimum. For the two composition functions, most of the solutions are clearly worse than the optimal values. The reason for the poor performance is that both functions are more challenging problems, with a randomly located global optimum and several randomly located deep local optima. They are asymmetrical multimodal problems with different properties in different areas. Owing to the complex shape of the composition functions, it is difficult to reach the same accuracy as on the other benchmark functions. However, we find that MA-HPSOL obtains relatively good results on these two composition functions when compared with some state-of-the-art algorithms, as will be shown in the following part. The experimental results for MA-HPSOL in optimizing the 16 functions with 30 variables (30-D functions) are shown in Table 4. The maximum number of function evaluations is set at 300,000.
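The statistics reported in Tables 3 and 4 are plain summaries over independent runs. A sketch, assuming the per-run best fitness values are collected in a list (the estimator choice is ours; the paper does not state which standard-deviation convention it uses):

```python
import numpy as np

def summarize(run_bests):
    # Max, Min, Mean and Std of the best fitness values over independent runs,
    # as reported in Tables 3 and 4. ddof=0 (population std) is an assumption.
    a = np.asarray(run_bests, dtype=float)
    return {"max": a.max(), "min": a.min(), "mean": a.mean(), "std": a.std()}
```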

Table 4
Statistical results of MA-HPSOL in optimizing 30-D functions.

Functions | Max | Min | Mean | Std
f1 | 3.955e-31 | 0 | 1.35e-319 | 0
f2 | .39e-5 | 1.111e5 | 5.31e- | 1.75e-5
f3 | 3.557e5 | 0 | 1.9e5 | 1.7e5
f4-f7 | 0 | 0 | 0 | 0
f8 | .79e1 | 0 | 9.1e- | 9.19e-
f9-f13 | 0 | 0 | 0 | 0
f14 | .5e1 | 0 | 5.93e- | 5.71e-
f15 | .3e- | .579e- | .91e-7 | .175e-7
f16 | 1.71e+1 | 1.7e+ | .591e+ | 3.57e+

The other parameters are the same as those for the 10-D functions. The statistical results in Table 4 are obtained from 30 independent runs. As shown in Table 4, when solving functions 4, 5, 6, 7, 9, 10, 11, 12 and 13, the statistical results, including the maximum, minimum and mean values for the 30 runs, are all equal to the optimal values. The results of functions 1 and 3 are not as good as those obtained in the 10-D experiment. The main reason for this phenomenon may be the lack of a sufficient number of particles for exploring the feasible space. We use the same pair {M, N} = {3, 3} for both the 10-D and 30-D experiments, which means the total number of particles in the population is 9. This number of particles is appropriate for the 10-D experiments, and MA-HPSOL obtains promising results, as do some other algorithms [13]. But the landscape of the 30-D test functions is so complex that it is difficult to explore such a high-dimensional feasible space (D = 30) with so few particles. As the results of functions 2, 8, 14, 15 and 16 show, MA-HPSOL can reach values which are very close to the optima. Also, the standard deviations are very small for all of these functions, which means that MA-HPSOL exhibits excellent stability over all 30 runs.

4.4. Comparisons with state-of-the-art algorithms

In order to further verify the effectiveness of MA-HPSOL, we evaluate its performance by comparing it with five existing algorithms ([5], [10], FDR-PSO [12], CLPSO [13] and [14]). For easy comparison with the state-of-the-art algorithms, the population size for all six algorithms is set to 9. Any algorithm-specific parameters are set exactly as in the original work.
The termination criterion is to run the algorithms until the number of function evaluations reaches the maximum value 100,000. All the results given in Table 5 are based on 30 independent runs.

Table 5
Results of six algorithms on the 10-D functions (mean ± std; the four algorithms whose row labels were lost appear unlabeled, in their original order).

f1 / f2 / f3 / f4
1.1911e ± .e | .73e ± 1.e | .35e ± .3135e | .9e ± .175e
0 ± 9.115e | ± 1.3e | 1.791e+ ± 1.7e+ | 1.79e ± 1.11e
FDR-PSO: .3711e-9 ± .e | 5.317e ± 1.375e | 1.1e ± 9.e5 | 1.1e ± .735e
1.1e1 ± 5.3e1 | .131e+ ± 1.7e+ | 3.557e5 ± .e | .59e-3 ± 5.75e-3
7.79e9 ± .97e | 5.111e+ ± 1.933e+ | 3.59e ± .55e | .5e ± .7759e
MA-HPSOL: 0 ± 0 | .39e ± 3.5e | 0 ± 0 | 0 ± 0

f5 / f6 / f7 / f8
.e ± 1.77e-3 | 5.339e+ ± 3.139e+ | 3.7e+ ± .95e+ | 5.7e+ ± .1e+
1.19e ± .353e | 1.113e+1 ± .1e+ | 1.333e+ ± 1.39e+ | 9.95e+ ± .1e+
FDR-PSO: .57e-3 ± 1.55e-3 | .715e+ ± .739e+ | 1.e+ ± 1.5313e+ | .91e+ ± 1.995e+
0 ± 0 | 0 ± 0 | 0 ± 0 | .35e-5 ± .597e-5
.99e3 ± 1.53e | 1.79e+ ± .35e+ | 1.7e ± 9.17e | 1.5193e+3 ± 3.71e+
MA-HPSOL: 0 ± 0 | 0 ± 0 | 0 ± 0 | 1.1e ± .519e

f9 / f10 / f11 / f12
.99e ± .e | 1.75e ± 7.59e | 5.51e ± 7.31e | 9.19e+ ± .11e+
1.399e+ ± 1.39e+ | 1.17e ± 7.331e | .373e+ ± 1.79e+ | 1.5531e+1 ± 5.9e+
FDR-PSO: 1.5e ± 3.99.9e | 1.e ± 5.9e | 3.951e ± .331e | 1.177e+1 ± .999e+
5.7e ± 1.31e | 5.577e ± 1.399e-3 | 5.1e ± .571e | .151e+ ± 1.17e+
1.353e3 ± 5.953e3 | 3.55e ± .73e | 9.35e ± 1.91e | 1.7993e+1 ± .55e+
MA-HPSOL: 0 ± 0 | 0 ± 0 | 0 ± 0 | 0 ± 0

f13 / f14 / f15 / f16
.7333e+ ± .773e+ | 7.935e+ ± 3.3e+ | 1.333e+ ± 1.37e+ | 1.55e+ ± .11e+
1.3171e+1 ± .e+ | 1.933e+3 ± 3.955e+ | .7e+1 ± 7.51e+1 | 1.313e+ ± 1.3e+
FDR-PSO: 1.7e+1 ± 3.19e+ | 1.5e+3 ± 3.7e+ | 1.1333e+ ± 1.195e+ | 1.1e+ ± 1.9e+
3.993e+ ± 1.1333e+ | .337e+ ± 1.373e+ | 9.3e+ ± .59e+1 | .9e+ ± .551e+
1.5e+1 ± 5.3e+ | 1.5355e+3 ± 3.39e+ | 5.3e+1 ± .599e+1 | .1e+1 ± 1.1e+
MA-HPSOL: 0 ± 0 | 1.77e-9 ± .395e-9 | 3.1e ± .379e-9 | 3.1e+ ± .59e+

Table 6
Results of six algorithms on the 30-D functions (mean ± std; the four algorithms whose row labels were lost appear unlabeled, in their original order).

f1 / f2 / f3 / f4
5.3e ± .79e3 | .e+1 ± .7331e+ | 1.91e ± 7.93e | 3.55e ± 3.13e
5.753e ± 1.37e-7 | 1.719e+1 ± .33e+ | 9.39e+ ± 3.e+ | .1e ± 1.997e
FDR-PSO: .135e-37 ± 1.399e-3 | 1.31e+1 ± .e+ | 1.93e ± .9e | .955e ± .557e
.11e ± .3e-99 | 1.91e+1 ± 5.9737e+ | 1.1e ± 3.77e5 ± 0
5.5e ± 1.7e | 3.e+1 ± .35e+1 | .391e+ ± .753e+ | 3.751e+ ± 1.39e+1
MA-HPSOL: 1.35e-319 ± 0 | 5.31e ± 1.75e-5 | 1.9e5 ± 1.7e5 | 0 ± 0

f5 / f6 / f7 / f8
.59e+ ± 1.719e+ | 5.391e+1 ± 1.95e+1 | .77e+1 ± 1.7e+1 | 3.3e+3 ± 5.39e+
1.75e+1 ± 3.713e+ | .7e+1 ± .e+1 | .e+1 ± 3.e+1 | 5.11e+3 ± .1e+
FDR-PSO: 1.59e+ ± 1.315e+ | .53e+1 ± 7.95e+ | .7e+1 ± 1.13e+1 | 3.71e+3 ± .73e+
0 ± 0 | 9.3e ± .351e | 3.5333e+ ± 1.1e+ | 3.373e+ ± 1.717e+
.33e+ ± .19e+ | 1.33e+ ± 3.77e+1 | 9.33e+1 ± 3.1e+1 | .99e+3 ± 7.391e+
MA-HPSOL: 0 ± 0 | 0 ± 0 | 0 ± 0 | 9.1e ± 9.19e

f9 / f10 / f11 / f12
.37e ± 7.35e | .5e ± .75e | .3977e+ ± .e+ | .9e+1 ± 1.3e+1
9.953e+ ± .55e+ | 1.333e ± .71e | .33e+1 ± .7319e+ | 1.99e+ ± .97e+1
FDR-PSO: 1.751e+ ± 5.93e | 1.99e ± .33e | .7e+ ± 1.791e+ | .91e+1 ± 1.3e+1
.119e-5 ± 1.e | .71e ± 1.35e-3 | 3.519e+ ± .973e+ | 3.95e+1 ± .77e+
.713e+ ± 7.337e+ | .973e ± 1.39e | 1.e+1 ± 7.e+ | 1.3e+ ± 3.171e+1
MA-HPSOL: 0 ± 0 | 0 ± 0 | 0 ± 0 | 0 ± 0

f13 / f14 / f15 / f16
7.99e+1 ± .7e+1 | .9e+3 ± 7.9191e+ | 3.e+1 ± 7.1e+1 | .1e+1 ± .957e+1
1.1e+ ± .7e+1 | 5.91e+3 ± .577e+ | 9.3333e+1 ± 1.e+ | 1.799e+ ± 1.e+
FDR-PSO: .375e+1 ± 1.9e+1 | .e+3 ± .339e+ | 3.7e+1 ± 7.9e+1 | .35e+1 ± 1.377e+
3.e+1 ± .595e+ | .51e+3 ± 5.753e+ | .35e-3 ± .3791e | .15e+1 ± 7.3e+1
1.11e+ ± 3.3e+1 | 7.799e+3 ± 7.39e+ | 5.19e+1 ± 3.57e+1 | 3.93e+ ± 3.591e+
MA-HPSOL: 0 ± 0 | 5.93e ± 5.71e | .91e-7 ± .175e-7 | .591e+ ± 3.57e+

Furthermore, a distance function Index(D), describing the mean distance between the optimal solution and the obtained best solution, is defined as follows [33]:

Index(D) = |f_opt(D) - f_best(D)| / D,   (6)

where f_opt(D) and f_best(D) are the optimal solution and the obtained best solution, respectively. This metric is usually used to compare how quickly the differences between the target solution and the solutions obtained by various evolutionary algorithms decrease.
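Eqs. (6) and (7) are straightforward to compute; a Python sketch (the function names are ours, not the paper's):

```python
import math

def index_metric(f_best, f_opt, D):
    # Eq. (6): mean distance between the optimal and the obtained best solution
    return abs(f_opt - f_best) / D

def dist_metric(f_best, f_opt, D):
    # Eq. (7): log10 is applied to f_best to compress the very small values;
    # the optima of all test functions are 0, so f_best must be > 0 here
    return abs(math.log10(f_best) - f_opt) / D
```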
In this paper, the optima of all the test functions are 0 and the obtained best solutions are usually very close to 0, so we use log10(f_best(D)) instead of f_best(D) to narrow the interval of the metric. But the abs(log) function is not monotonic, so we modify Index(D) into Dist(D) as follows, which lets us easily visualize the results of each algorithm:

Dist(D) = |log10(f_best(D)) - f_opt(D)| / D.   (7)

Fig. 10 presents the Dist(D) values in terms of the best fitness value of the median run of each algorithm on each test function (D = 10). We record the best solutions every 500 function evaluations for each test problem, with 100,000 function evaluations in total. So the interval of the horizontal coordinate is [1, 200] and the vertical coordinate shows Dist(D). From the results in Table 5 and Fig. 10, we observe that MA-HPSOL surpasses all the other algorithms on all functions except one, on which one compared algorithm is superior to MA-HPSOL. However, when we run both that algorithm and MA-HPSOL 50 independent times, we find that it has a small probability (3/50) of being trapped in the local optima 11.33 and 3.77, while MA-HPSOL still obtains accurate results. The convergence characteristic of MA-HPSOL is very promising on unimodal, multimodal, rotated multimodal and composition problems. One of the compared algorithms performs well only on unimodal function 1, where the other algorithms also obtain good results. Another shows good convergence characteristics on functions 1, 3 and 9, while FDR-PSO converges relatively well on functions 3 and 9, among others. This is a reasonable phenomenon, because function 9 is just the rotated version of function 3. A further algorithm shows good convergence in optimizing functions 1, 3, 5, 6, 7, 9 and 10. Because the particles in MA-HPSOL are trained with the comprehensive learning method, MA-HPSOL and CLPSO share some similar convergence characteristics in optimizing functions 1, 3, 9, 15 and 16, among others, with the difference that MA-HPSOL converges faster than CLPSO. Especially for functions 3, 4, 5, 6, 7, 9, 10, 11, 12 and 13, MA-HPSOL converges to the global optimum well within the total budget of function evaluations.
This is mainly caused by the multi-swarms in the hierarchical architecture and the local search strategy. Table 6 gives the means and standard deviations of the 30 runs of the six algorithms on the sixteen test functions with D = 30. As the convergence graphs are similar to those of the 10-D problems, they are not presented here. It is an acid test for these algorithms to work with a population of just 9 particles. From the results in Table 6, we

Fig. 10. The median Dist(D) values of the 10-D test functions (functions 1-8; plots omitted).

Fig. 10. (continued) (functions 9-16; plots omitted).

Table 7
Results of data set 1 cylindricity error evaluation.

Parameter | Improved GA [37] | PSO [38] | MA-HPSOL
x: .95 | .3315 | .
y: .53 | .1 | .9
z: .13 | 0 | 0
l: .35 | .5 | .591
m: .1 | .9 | .1
n: .99935 | 1 | 1
Cylindricity: .1597 | .53 | .1

Fig. 11. Definition of cylindricity error.

can observe that the performance of almost all the algorithms except MA-HPSOL degrades dramatically when optimizing high-dimensional problems with a small population size. Taking one of them as an example, it can attain much higher precision with its original population size, but only around 1e-3 with population size 9.

4.5. Cylindricity error evaluation based on MA-HPSOL

In the past few years, many kinds of evolutionary algorithms have been applied to optimize a wide range of manufacturing processes [34-36], whose demands for robustness, flexibility and the ability to handle complexity are ever increasing. Cylindrical features have become some of the most important features in mechanical design. They contribute significantly to fundamental mechanical products such as transmission systems, revolving devices and injection molds in achieving the intended functionalities. Therefore, evaluating the cylindricity error precisely is very important in high-precision manufacturing. Many attempts have been made to evaluate the cylindricity error [37,38]. The definition of the cylindricity error can be stated as follows [39]. Fig. 11 illustrates the cross section of a cylinder with axis direction n(l, m, 1) and radius R. The projection of a measured point P_i onto the cylinder is E_i. Assuming the axis passes through the point Q(x0, y0, 0), the axis can be expressed as (x - x0)/l = (y - y0)/m = z. The distance from P_i (i = 1, 2, ...) to the axis is

e_i = |E_i P_i| = |QP_i x n| / |n| = | det[ i, j, k ; x_i - x0, y_i - y0, z_i ; l, m, 1 ] | / sqrt(l^2 + m^2 + 1),   (8)

where |.| denotes the length of a vector in Euclidean space. Mathematically, the cylindricity error evaluation can be formulated as an optimization problem with parameter vector (x0, y0, l, m). Hence, the fitness function for evaluating the cylindricity error under the minimum zone cylinder (MZC) criterion aims at minimizing the objective function: f(x0, y0, l, m) = max_i(e_i) - min_i(e_i).
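The distance of Eq. (8) and the MZC objective of Eq. (9) can be sketched as follows; the parameter ordering (x0, y0, l, m) follows the text, while the function name and array layout are our own assumptions:

```python
import numpy as np

def cylindricity(params, points):
    # params = (x0, y0, l, m): axis through Q(x0, y0, 0) with direction n = (l, m, 1)
    # points: (n_points, 3) array of measured coordinates P_i
    x0, y0, l, m = params
    n = np.array([l, m, 1.0])
    q = np.array([x0, y0, 0.0])
    d = np.asarray(points, dtype=float) - q                    # vectors QP_i
    e = np.linalg.norm(np.cross(d, n), axis=1) / np.linalg.norm(n)  # Eq. (8)
    # Eq. (9), minimum zone cylinder criterion: width of the radial band
    return e.max() - e.min()
```

For points lying exactly on a cylinder around the z-axis, the returned error is 0.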
(9)

Here we evaluate the cylindricity error with the proposed MA-HPSOL algorithm; the related parameters are set as follows: (1) the MA-HPSOL-dependent parameters are set as shown in Table 2; (2) the dimension of the particles is 4, which is the length of the parameter vector (x0, y0, l, m); (3) termination condition: a preset maximum number of generations. The remaining parameters are the same as in [13]. The measurement data sets are taken from Refs. [38,40]. All parameters are initialized in [0, 1]. The evaluation results are given in Tables 7 and 8. As shown in Tables 7 and 8, the proposed MA-HPSOL algorithm is a competitive approach to cylindricity error evaluation, which is obviously a complicated optimization problem. When comparing with other types of evolutionary algorithms

Table 8
Results of data set 2 cylindricity error evaluation.

Parameter | Improved GA [41] | PSO-DE [39] | MA-HPSOL
x: .1153 | .15 | .19
y: .79 | .91 | .911
z: 0 | 0 | 0
l: .7 | .19 | .19
m: .9 | .915 | .915
n: 1 | 1 | 1
Cylindricity: .17 | .139719 | .13959

(Improved GA [37,41], PSO [38], PSO-DE [39]), the results obtained by MA-HPSOL are better than those listed in the existing literature.

5. Conclusion and future work

This paper presents a high-performance memetic algorithm (MA-HPSOL) for complex numerical optimization problems. The proposed algorithm has three main components: a hierarchical particle swarm optimizer for exploration, a local search method based on Latin hypercube sampling for exploitation, and a mutation operator using differential information. Concretely, the hierarchical PSO is composed of two layers: the bottom layer and the top layer. Particles in each swarm of the bottom layer evolve independently, which means each swarm is a niche with no influence on the other swarms. The global best position of each swarm in the bottom layer becomes a candidate particle in the top layer, so the global best position of the top-layer swarm steers the particles in each bottom-layer swarm indirectly. The local search strategy, Latin hypercube sampling, aims at exploiting the best solutions found so far uniformly.
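A minimal sketch of Latin hypercube sampling, the stratified sampling scheme [31] underlying the local search (the interface is our own, not the paper's):

```python
import numpy as np

def latin_hypercube(n, d, rng=None):
    # n stratified samples in [0, 1)^d: each of the n equal-width intervals
    # of every dimension receives exactly one sample, and the strata are
    # paired at random across dimensions by shuffling each column
    rng = np.random.default_rng(rng)
    samples = (np.arange(n)[:, None] + rng.random((n, d))) / n
    for j in range(d):
        rng.shuffle(samples[:, j])
    return samples
```

In MA-HPSOL the sampled points would then be mapped into a region around the best solutions found so far and evaluated; that surrounding logic is omitted here.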
Both the exploration and the exploitation operators help keep the diversity of the whole population at a high level so that particles avoid being trapped in local optima. Even if the particles in one swarm are trapped in local optima, the other swarms are still likely to reach the global optima. Furthermore, a mutation operator, aimed at modifying the particles' positions based on differential information, is used. According to the experimental results on the 16 functions, the proposed memetic algorithm (MA-HPSOL) shows excellent performance in finding global optimal solutions. MA-HPSOL is also used to evaluate the cylindricity error, and the experimental results show that it obtains competitive performance there as well. In our future work, two aspects will be investigated in depth: quantitatively depicting the diversity of the whole population, and imposing mutual communication among the swarms in the bottom layer.

Acknowledgments

This work was partially supported by the National Basic Research Program of China (grant no. 9CB391) and the European Union Seventh Framework Program (grant no. 719). We would like to thank Prof. P. N. Suganthan for providing source code.

References

[1] H. Azamathulla, F. Wu, Support vector machine approach for longitudinal dispersion coefficients in natural streams, Applied Soft Computing 11 (2011) 9-95.
[2] H. Azamathulla, A. Ghani, C. Chang, Z. Hasan, N. Zakaria, Machine learning approach to predict sediment load - a case study, Clean - Soil, Air, Water 3 (2010) 99-97.
[3] J. Kennedy, R. Eberhart, Particle swarm optimization, in: Proceedings of the IEEE International Conference on Neural Networks, vol. 4, 1995, pp. 1942-1948.
[4] R. Eberhart, J. Kennedy, A new optimizer using particle swarm theory, in: Proceedings of the Sixth International Symposium on Micro Machine and Human Science, 1995, pp. 39-43.
[5] Y. Shi, R. Eberhart, A modified particle swarm optimizer, in: Proceedings of the IEEE International Conference on Evolutionary Computation, 1998, pp. 69-73.
[6] J. Kennedy, Small worlds and mega-minds: effects of neighborhood topology on particle swarm performance, in: Proceedings of the IEEE Congress on Evolutionary Computation, vol. 3, 1999.
[7] J. Kennedy, R. Mendes, Population structure and particle swarm performance, in: Proceedings of the IEEE Congress on Evolutionary Computation, IEEE, 2002, pp. 1671-1676.
[8] P. Suganthan, Particle swarm optimiser with neighbourhood operator, in: Proceedings of the IEEE Congress on Evolutionary Computation, vol. 3, 1999.
[9] W. Jian, Y. Xue, J. Qian, Improved particle swarm optimization algorithms study based on the neighborhoods topologies, in: Proceedings of the IEEE Annual Conference of the Industrial Electronics Society, vol. 3, pp. 19-19.
[10] K. Parsopoulos, M. Vrahatis, UPSO: a unified particle swarm optimization scheme, Lecture Series on Computer and Computational Sciences 1 (2004) 868-873.
[11] R. Mendes, J. Kennedy, J. Neves, The fully informed particle swarm: simpler, maybe better, IEEE Transactions on Evolutionary Computation 8 (3) (2004) 204-210.
[12] T. Peram, K. Veeramachaneni, C. Mohan, Fitness-distance-ratio based particle swarm optimization, in: Proceedings of the IEEE Swarm Intelligence Symposium, 2003, pp. 174-181.
[13] J. Liang, A. Qin, P. Suganthan, S. Baskar, Comprehensive learning particle swarm optimizer for global optimization of multimodal functions, IEEE Transactions on Evolutionary Computation 10 (3) (2006) 281-295.
[14] Y. Jiang, T. Hu, C. Huang, X. Wu, An improved particle swarm optimization algorithm, Applied Mathematics and Computation 193 (1) (2007) 231-239.
[15] S. Yang, C. Li, A clustering particle swarm optimizer for locating and tracking multiple optima in dynamic environments, IEEE Transactions on Evolutionary Computation 14 (2010) 959-974.
[16] H. Wang, S. Yang, W. Ip, D. Wang, A particle swarm optimization based memetic algorithm for dynamic optimization problems, Natural Computing 9 (3) (2010) 703-725.
[17] H. Wang, S. Yang, W.H. Ip, D. Wang, A memetic particle swarm optimisation algorithm for dynamic multi-modal optimisation problems, International Journal of Systems Science 43 (7) (2012) 1-13.
[18] C. Chen, Two-layer particle swarm optimization for unconstrained optimization problems, Applied Soft Computing 11 (1) (2011) 95-3.
[19] P. Angeline, Using selection to improve particle swarm optimization, in: Proceedings of the IEEE International Conference on Evolutionary Computation, 1998, pp. 84-89.
[20] M. Lovbjerg, T. Rasmussen, T. Krink, Hybrid particle swarm optimiser with breeding and subpopulations, in: Proceedings of the Third Genetic and Evolutionary Computation Conference, vol. 1, Citeseer, 2001, pp. 469-476.
[21] V. Miranda, N. Fonseca, EPSO - evolutionary particle swarm optimization, a new algorithm with applications in power systems, in: Transmission and Distribution Conference and Exhibition: Asia Pacific, IEEE/PES, vol. 2, 2002, pp. 745-750.
[22] K. Parsopoulos, M. Vrahatis, On the computation of all global minimizers through particle swarm optimization, IEEE Transactions on Evolutionary Computation 8 (3) (2004) 211-224.
[23] F. van den Bergh, A. Engelbrecht, A cooperative approach to particle swarm optimization, IEEE Transactions on Evolutionary Computation 8 (3) (2004) 225-239.
[24] S. Ling, H. Iu, K. Chan, H. Lam, B. Yeung, F. Leung, Hybrid particle swarm optimization with wavelet mutation and its industrial applications, IEEE Transactions on Systems, Man, and Cybernetics, Part B 38 (3) (2008) 743-763.
[25] X. Zhao, A perturbed particle swarm algorithm for numerical optimization, Applied Soft Computing 10 (1) (2010) 119-124.
[26] H. Gao, W. Xu, Particle swarm algorithm with hybrid mutation strategy, Applied Soft Computing 11 (2011) 5129-5142.
[27] S. Hsieh, T. Sun, C. Liu, S. Tsai, Efficient population utilization strategy for particle swarm optimizer, IEEE Transactions on Systems, Man, and Cybernetics, Part B 39 (2) (2009) 444-456.
[28] R. Brits, A. Engelbrecht, F. van den Bergh, Solving systems of unconstrained equations using particle swarm optimization, in: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, vol. 3, 2002, pp. 102-107.
[29] N. Huy, O. Soon, L. Hiot, N. Krasnogor, Adaptive cellular memetic algorithms, Evolutionary Computation 17 (2) (2009) 231-256.
[30] A. Qin, V. Huang, P. Suganthan, Differential evolution algorithm with strategy adaptation for global numerical optimization, IEEE Transactions on Evolutionary Computation 13 (2) (2009) 398-417.
[31] M. McKay, R. Beckman, W. Conover, A comparison of three methods for selecting values of input variables in the analysis of output from a computer code, Technometrics 21 (2) (1979) 239-245.
[32] J. Liang, P. Suganthan, K. Deb, Novel composition test functions for numerical global optimization, in: Proceedings of the IEEE Swarm Intelligence Symposium, 2005, pp. 68-75.
[33] S. Ho, L. Shu, J. Chen, Intelligent evolutionary algorithms for large parameter optimization problems, IEEE Transactions on Evolutionary Computation 8 (6) (2004) 522-541.
[34] K. Chan, C. Kwong, Y. Tsim, A genetic programming based fuzzy regression approach to modelling manufacturing processes, International Journal of Production Research 48 (7) (2010) 1967-1984.
[35] K. Chan, C. Kwong, Y. Tsim, Modelling and optimization of fluid dispensing for electronic packaging using neural fuzzy networks and genetic algorithms, Engineering Applications of Artificial Intelligence 23 (1) (2010) 1.
[36] K. Chan, C. Kwong, H. Jiang, M. Aydin, T. Fogarty, A new orthogonal-array-based crossover, with analysis of gene interactions, for evolutionary algorithms and its application to car door design, Expert Systems with Applications 37 (5) (2010) 3853-3862.
[37] H. Lin, Y. Peng, Evaluation of cylindricity error based on an improved GA with uniform initial population, in: Proceedings of the IITA International Conference on Control, Automation and Systems Engineering, IEEE, 2009, pp. 311-314.
[38] J. Mao, Y. Cao, J. Yang, Implementation uncertainty evaluation of cylindricity errors based on geometrical product specification (GPS), Measurement 42 (5) (2009) 742-747.
[39] X. Zhang, X. Jiang, P. Scott, A reliable method of minimum zone evaluation of cylindricity and conicity from coordinate measurement data, Precision Engineering 35 (3) (2011) 484-489.
[40] K. Carr, P. Ferreira, Verification of form tolerances, part II: cylindricity and straightness of a median line, Precision Engineering 17 (1995) 144-156.
[41] X. Wen, A. Song, An improved genetic algorithm for planar and spatial straightness error evaluation, International Journal of Machine Tools and Manufacture 43 (11) (2003) 1157-1164.