Parameters Optimization of SVM Based on Improved FOA and Its Application in Fault Diagnosis

Qiantu Zhang1*, Liqing Fang1, Sicai Su2, Yan Lv1

1 First Department, Mechanical Engineering College, Shijiazhuang, Hebei Province, P. R. China.
2 Fourth Department, Mechanical Engineering College, Shijiazhuang, Hebei Province, P. R. China.
* Corresponding author. Tel.: +86186311981; email: qiantuz@sina.com
Manuscript submitted July 7, 2015; accepted September 8, 2015.
doi: 10.17706/jsw.10.11.1031-1039

Abstract: In most cases, fault diagnosis is essentially a pattern recognition problem, and the support vector machine (SVM) provides a new solution for diagnosing systems for which only a few fault samples are available. However, the selection of the SVM parameters has a significant influence on diagnosis performance. In this paper, an improved fruit fly optimization algorithm (IFOA), which is essentially the standard fruit fly optimization algorithm (FOA) combined with a Levy flight search strategy, is proposed to determine the SVM parameters. Several benchmark datasets are used to evaluate the proposed algorithm. Furthermore, the proposed method is applied to diagnosing the faults of a hydraulic pump. The experiments and the engineering application show that the proposed method outperforms the standard FOA, the genetic algorithm (GA) and particle swarm optimization (PSO).

Key words: SVM, fruit fly optimization algorithm, Levy flight, fault diagnosis.

1. Introduction

With the development of artificial intelligence, many methods have been applied to fault diagnosis, such as expert systems [1], fuzzy logic [2] and neural networks [3], and successful results have been obtained. However, in practical engineering applications, a complex system with multiple factors, strong coupling and intensive nonlinearity makes it difficult to obtain the large amount of useful fault information that these methods require. The support vector machine, as a relatively new machine learning method, has been widely used in fault diagnosis owing to its strong classification capability and good generalization even when the fault samples are few [4]-[6]. However, the kernel function parameter and the penalty factor of the SVM seriously affect its classification performance, and these parameters are difficult to select because of the lack of a corresponding theoretical basis.

In order to find appropriate SVM parameters, many intelligent evolutionary algorithms, such as the genetic algorithm (GA) [7], particle swarm optimization (PSO) [8], the ant colony optimization algorithm (ACO) [9] and artificial bee colony optimization (ABC) [10], have been applied to SVM parameter optimization and have improved the performance of the SVM to some extent. However, owing to their inherent shortcomings, these methods are still time consuming and do not perform well in many situations. The fruit fly optimization algorithm (FOA), introduced by Pan [11], is a novel global optimization method inspired by the foraging behavior of fruit flies. The FOA has few parameters to adjust, is easy to understand, and is simple to implement in a computer program.

In this paper, aiming at the problems that the FOA easily falls into a local optimum in global optimization and that its stability is not satisfactory, an improved FOA (IFOA) is proposed and used to optimize the SVM parameters. In the IFOA, the fruit fly swarm is divided into two subgroups, a Levy flight search strategy is introduced, and the way each fruit fly updates its location is changed. In this way the diversity of the swarm is preserved and the global and local search abilities are balanced; at the same time, the swarm can escape from local optima thanks to the occasional long jumps of the Levy flight. The experimental results on hydraulic pump fault data indicate that the proposed method reduces the training time while increasing the diagnosis accuracy, and that it has a certain superiority over the SVM combined with FOA, GA or PSO separately.

2. Principle of Support Vector Machine

The main purpose of the SVM is to construct an optimal separating hyper-plane after mapping the training samples from the input space into a higher-dimensional feature space via a kernel function. Suppose there is a training set $T = \{(x_1, y_1), \dots, (x_l, y_l)\}$, $x_i \in R^n$, $y_i \in \{-1, 1\}$, where $x_i$ is the input vector, $y_i$ is the label of $x_i$, $l$ is the number of input vectors and $n$ is the input dimension. The separating hyper-plane has the form $\omega \cdot x + b = 0$, where $\omega$ is a weight vector and $b$ is a scalar. The goal of maximizing the margin width is equivalent to the following optimization problem:

$$\min_{\omega, b, \xi} \ \frac{1}{2}\|\omega\|^2 + C\sum_{i=1}^{l}\xi_i, \quad \text{s.t. } y_i(\omega \cdot x_i + b) \ge 1 - \xi_i, \ \xi_i \ge 0, \ i = 1, \dots, l \qquad (1)$$

where $C$ is the penalty parameter and the $\xi_i$ are positive slack variables which are necessary to allow misclassification. According to the Lagrangian principle, the above optimization problem can be transformed into its dual form as follows:

$$\min_{\alpha} \ \frac{1}{2}\sum_{i=1}^{l}\sum_{j=1}^{l}\alpha_i \alpha_j y_i y_j K(x_i, x_j) - \sum_{i=1}^{l}\alpha_i, \quad \text{s.t. } \sum_{i=1}^{l}\alpha_i y_i = 0, \ 0 \le \alpha_i \le C, \ i = 1, \dots, l \qquad (2)$$

where $K(x_i, x_j) = \varphi(x_i) \cdot \varphi(x_j)$ is the kernel function, which maps the sample space into a high-dimensional feature space. After solving the above problem, the corresponding decision function can be written as:

$$f(x) = \operatorname{sgn}\left(\sum_{i=1}^{l}\alpha_i y_i K(x_i, x) + b\right) \qquad (3)$$

The typical kernel functions are the polynomial kernel function, the radial basis kernel function and the sigmoid kernel function. Among them, the radial basis function (RBF) kernel is widely used because of its universal adaptability and good performance. The form of the RBF kernel is:

$$K(x_i, x_j) = \exp\left(-g\,\|x_i - x_j\|^2\right) \qquad (4)$$

where $g$ denotes the width of the RBF kernel. From (2), (3) and (4), we can see that the parameters that need to be optimized in the RBF-SVM are the penalty parameter $C$ and the kernel parameter $g$. The penalty parameter $C$ regulates the trade-off between the confidence range and the empirical risk in the feature space so as to maximize the generalization ability of the SVM. The kernel parameter $g$ mainly affects the distribution complexity of the samples in the high-dimensional feature space. The performance of the SVM depends heavily on these two parameters, so it is of great importance to use an optimization algorithm that can give useful guidance on their selection.
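To make the role of $C$ and $g$ concrete, the following short Python sketch (not from the paper; the use of scikit-learn and the helper name svm_fitness are assumptions made only for illustration) builds an RBF-SVM for a given (C, g) pair and returns the five-fold cross-validation accuracy that later serves as the fitness value in Section 3.3.

```python
# Minimal sketch, assuming scikit-learn is acceptable as the SVM implementation.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def svm_fitness(C, g, X, y):
    """Mean five-fold cross-validation accuracy of an RBF-SVM with parameters (C, g)."""
    # In scikit-learn, `gamma` plays the role of g in K(xi, xj) = exp(-g * ||xi - xj||^2).
    clf = SVC(C=C, kernel="rbf", gamma=g)
    return cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 27))    # e.g. 27-dimensional feature vectors (cf. Section 4.2)
    y = np.repeat([0, 1, 2, 3], 50)   # four class labels, as in the pump experiment
    print(svm_fitness(10.0, 0.1, X, y))
```

Different (C, g) pairs can change this score substantially, which is exactly why the next section searches for them with an optimization algorithm.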

3. Parameters Optimization of SVM with IFOA

3.1. FOA Algorithm

The fruit fly optimization algorithm, proposed by Pan in 2012 [11], is a new global optimization algorithm inspired by the food-searching behavior of fruit flies. The olfactory organ of a fruit fly can collect scents floating in the air; then, after it gets close to the food location, it can also use its sensitive vision to find the food and fly in that direction. The main steps of the FOA are as follows [11] (a minimal code rendering is given after the steps):

Step 1. Randomly initialize the fruit fly swarm location: $Init\ X\_axis$, $Init\ Y\_axis$.

Step 2. Give each fruit fly a random direction and distance for the food search:

$$X_i = X\_axis + RandomValue, \qquad Y_i = Y\_axis + RandomValue \qquad (5)$$

Step 3. Since the food location is unknown, the distance to the origin (Dist) is estimated first, and then the smell concentration judgment value (S) is calculated as the reciprocal of that distance:

$$Dist_i = \sqrt{X_i^2 + Y_i^2}, \qquad S_i = 1 / Dist_i \qquad (6)$$

Step 4. Substitute the smell concentration judgment value (S) into the smell concentration judgment function (also called the fitness function) to obtain the smell concentration (Smell) at the location of each individual fruit fly:

$$Smell_i = Function(S_i) \qquad (7)$$

Step 5. Find the fruit fly with the maximal smell concentration (finding the maximal or the minimal value) in the swarm:

$$[bestSmell,\ bestIndex] = \max(Smell) \qquad (8)$$

Step 6. Keep the best smell concentration value and the corresponding coordinates (X, Y). The fruit fly swarm then uses vision to fly towards that location:

$$Smellbest = bestSmell, \qquad X\_axis = X(bestIndex), \qquad Y\_axis = Y(bestIndex) \qquad (9)$$

Step 7. Repeat steps 2-5 and judge whether the current smell concentration is superior to that of the previous iteration; if so, implement step 6.
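The seven steps above can be condensed into a short loop. The sketch below is a minimal Python rendering of the standard FOA just described (illustrative only; the function name foa and the parameter defaults are not from the paper), maximizing a user-supplied fitness function of the smell concentration judgment value S.

```python
# A minimal sketch of the standard FOA described above.
import math
import random

def foa(fitness, n_flies=20, n_iter=100, init_range=10.0, step=1.0):
    # Step 1: randomly initialize the swarm location.
    x_axis = random.uniform(-init_range, init_range)
    y_axis = random.uniform(-init_range, init_range)
    best_smell = -math.inf
    for _ in range(n_iter):
        xs, ys, smells = [], [], []
        for _ in range(n_flies):
            # Step 2: random direction and distance around the swarm location.
            x = x_axis + random.uniform(-step, step)
            y = y_axis + random.uniform(-step, step)
            # Step 3: distance to the origin and smell concentration judgment value S.
            s = 1.0 / math.hypot(x, y)
            # Step 4: smell concentration (fitness) of this individual.
            xs.append(x); ys.append(y); smells.append(fitness(s))
        # Step 5: fruit fly with maximal smell concentration.
        best_index = max(range(n_flies), key=lambda i: smells[i])
        # Steps 6-7: keep the best location only if it improves on the previous best.
        if smells[best_index] > best_smell:
            best_smell = smells[best_index]
            x_axis, y_axis = xs[best_index], ys[best_index]
    return best_smell, 1.0 / math.hypot(x_axis, y_axis)

# Example: maximize a simple function of S with its optimum at S = 0.5.
if __name__ == "__main__":
    print(foa(lambda s: -(s - 0.5) ** 2))
```

Note that the algorithm never manipulates S directly; it only moves the (X, Y) swarm location, which is exactly the updating rule the IFOA below modifies.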

3.2. Design of IFOA

From the basic steps of the FOA, we can see that the fruit fly swarm only learns from the best individual during the whole iteration process: once the best individual is discovered, all the fruit flies fly to that location. This kind of location updating not only decreases the diversity of the swarm, but also makes the algorithm easily trapped in local optima and degrades its convergence speed and precision when that individual is not the global best. Experience from sociology tells us that the global optimum usually exists around local optima, and that the evolutionary rate of a group depends to a greater extent on its worse individuals than on its better ones. On the other hand, it is reported that many animals and insects, such as fruit flies and honey bees, adopt a Levy flight search strategy, in which exploratory short jumps alternate with occasional long jumps, when they search for food. The exploratory short jumps ensure a fine local search of their landscape, and the occasional long jumps help them move into other areas to make a more extensive search. Considering these merits, the Levy flight strategy has been applied to optimization and optimal search, and preliminary results have shown its promising capability [12], [13].

According to the above analysis, an improved FOA is proposed. The IFOA is based on the basic FOA. In the whole evaluation process, the distances (D_best and D_worst) of each fruit fly individual to the best individual and the worst individual in the contemporary swarm are calculated using (10):

$$Dist\_best_i = \sqrt{(X_i - X_b)^2 + (Y_i - Y_b)^2}, \qquad Dist\_worst_i = \sqrt{(X_i - X_w)^2 + (Y_i - Y_w)^2} \qquad (10)$$

where $(X_i, Y_i)$ is the location of a fruit fly individual, and $(X_b, Y_b)$ and $(X_w, Y_w)$ are the locations of the best individual and the worst individual in the contemporary swarm. If D_best > D_worst, the individual is assigned to the drawback subgroup; otherwise, it is assigned to the advanced subgroup (the membership and size of the two subgroups change during the iteration process). For the drawback subgroup, a global search is made under the guidance of the best individual, and the fruit flies update their locations according to (11). For the advanced subgroup, the Levy flight strategy is introduced: a local search is made around the best individual, and the fruit flies update their locations according to (12).

$$X_i' = X_b + RandomValue, \qquad Y_i' = Y_b + RandomValue \qquad (11)$$

$$X_i' = X_i + \alpha (X_i - X_b) \otimes L, \qquad Y_i' = Y_i + \alpha (Y_i - Y_b) \otimes L \qquad (12)$$

where $(X_i', Y_i')$ is the new location of each fruit fly, $L$ is the random search path of the Levy flight and $\alpha$ is the step size, which should be related to the scale of the particular problem under study. The symbol $\otimes$ denotes entrywise multiplication. The fruit flies in each subgroup update their locations using (11) and (12) respectively, which replaces the original updating rule in which all the fruit flies fly to the best location. The new updating rule not only prevents the loss of diversity, but also balances the global and local search abilities. Meanwhile, since the size and direction of $L$ are highly random, its occasional long jumps can make the fruit flies jump from one area to another, which ensures that they avoid being attracted by a local optimum during the local search. In (12), $L$ is calculated using Mantegna's algorithm [14]:

$$s = \frac{\mu}{|v|^{1/\beta}} \qquad (13)$$

where $s$ gives the random search path $L$, $\beta$ determines its scope, and $\mu$ and $v$ are drawn from normal distributions, that is,

$$\mu \sim N(0, \sigma_\mu^2), \qquad v \sim N(0, \sigma_v^2) \qquad (14)$$

with

$$\sigma_\mu = \left\{ \frac{\Gamma(1+\beta)\sin(\pi\beta/2)}{\Gamma\!\left[(1+\beta)/2\right]\, \beta\, 2^{(\beta-1)/2}} \right\}^{1/\beta}, \qquad \sigma_v = 1 \qquad (15)$$

where $\Gamma(\cdot)$ is the standard Gamma function.
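The following sketch (not the authors' code) shows how the two location-update rules and Mantegna's algorithm fit together in Python; the values beta = 1.5 and alpha = 0.5 and the uniform range used for RandomValue are assumptions, since the paper does not fix them here.

```python
# Sketch of the IFOA location updates: equation (11) for the drawback subgroup and
# equation (12), with a Mantegna Levy step, for the advanced subgroup.
import numpy as np
from math import gamma, sin, pi

def levy_step(shape, beta=1.5):
    """Mantegna's algorithm, equations (13)-(15): s = mu / |v|**(1/beta)."""
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    mu = np.random.normal(0.0, sigma_u, shape)
    v = np.random.normal(0.0, 1.0, shape)
    return mu / np.abs(v) ** (1 / beta)

def update_location(pos, best, worst, alpha=0.5):
    """Update one fruit fly location `pos` (array [X, Y]) per equations (10)-(12)."""
    d_best = np.linalg.norm(pos - best)     # equation (10)
    d_worst = np.linalg.norm(pos - worst)
    if d_best > d_worst:
        # Drawback subgroup: global search guided by the best individual, eq. (11).
        return best + np.random.uniform(-1.0, 1.0, pos.shape)
    # Advanced subgroup: Levy-flight local search around the best individual, eq. (12).
    return pos + alpha * (pos - best) * levy_step(pos.shape)

if __name__ == "__main__":
    pos = np.array([1.0, 2.0])
    print(update_location(pos, best=np.array([0.5, 1.5]), worst=np.array([3.0, 3.0])))
```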

3.3. Parameters Optimization of SVM with IFOA

The SVM classification model constructed with the radial basis kernel function has only two parameters that need to be optimized: the penalty parameter C and the kernel parameter g. In this paper, the IFOA method is employed to determine these SVM parameters. Here each fruit fly encodes the parameter pair (C, g), and the process of optimizing the SVM parameters with the IFOA can be described as follows (a minimal sketch of this loop is given after the steps):

Step 1. Randomly generate the initial location of each fruit fly, which determines the scope of the SVM parameter vector (C, g). Set the number of fruit flies, the maximum iteration number and the step size α. Set the iteration variable t = 0 and perform the training process from Step 2 to Step 6.

Step 2. Set the iteration variable t = t + 1.

Step 3. Evaluate the quality of every fruit fly using the fitness function, which is calculated with the five-fold cross-validation method.

Step 4. Update the best fruit fly and the global best fruit fly according to the fitness values. Then divide the swarm into two subgroups according to (10) and update the location of the fruit flies in each subgroup according to (11) and (12).

Step 5. Go to Step 6 if the stopping criterion is satisfied; otherwise, go to Step 2 to continue the operation.

Step 6. End the training procedure; the parameters C and g obtained at this point are the final model parameters.
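A condensed, self-contained Python sketch of Steps 1 to 6 follows. It is illustrative only: the population size, the number of iterations, the search range [0.001, 100], beta = 1.5 and alpha = 0.5 are assumptions rather than the paper's settings, and scikit-learn supplies the SVM and the five-fold cross-validation.

```python
# Minimal IFOA-SVM sketch under the assumptions stated above; not the authors' code.
import numpy as np
from math import gamma, sin, pi
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def cv_fitness(params, X, y):
    C, g = params
    return cross_val_score(SVC(C=C, kernel="rbf", gamma=g), X, y, cv=5).mean()

def levy(n, beta=1.5):
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    return np.random.normal(0, sigma_u, n) / np.abs(np.random.normal(0, 1, n)) ** (1 / beta)

def ifoa_svm(X, y, n_flies=20, n_iter=50, lo=1e-3, hi=100.0, alpha=0.5):
    swarm = np.random.uniform(lo, hi, (n_flies, 2))           # Step 1: each fly encodes (C, g)
    best_p, best_f = swarm[0].copy(), -np.inf
    for _ in range(n_iter):                                    # Steps 2 and 5: iterate
        fit = np.array([cv_fitness(p, X, y) for p in swarm])   # Step 3: five-fold CV fitness
        if fit.max() > best_f:                                  # Step 4: track the global best fly
            best_f, best_p = fit.max(), swarm[fit.argmax()].copy()
        best = swarm[fit.argmax()].copy()
        worst = swarm[fit.argmin()].copy()
        for i, p in enumerate(swarm):                           # Step 4: subgroup location updates
            if np.linalg.norm(p - best) > np.linalg.norm(p - worst):
                swarm[i] = best + np.random.uniform(-1, 1, 2)   # eq. (11): drawback subgroup
            else:
                swarm[i] = p + alpha * (p - best) * levy(2)     # eq. (12): advanced subgroup
        swarm = np.clip(swarm, lo, hi)                          # keep (C, g) inside the search range
    return best_p                                               # Step 6: final (C, g)
```

In practice one would call best_C, best_g = ifoa_svm(X_train, y_train) and then train a final SVC(C=best_C, gamma=best_g, kernel="rbf") on the whole training set, which is how the optimized parameters are used in Section 4.3.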

3.4. Verification of IFOA-SVM Algorithm

To evaluate the performance of the proposed IFOA-SVM method, we used three common benchmark data sets from the UCI repository [15]: the Glass, Segment and German datasets. A description of the three datasets is given in Table 1.

Table 1. Description of the UCI Datasets
Name      Attribute   Class   Training set   Testing set
German    24          2       7
Segment   18          7       17             161
Glass     9           6       8              17

In the experiment, the proposed IFOA method and three other classical optimization algorithms, i.e., the traditional FOA, GA and PSO, are used to optimize the SVM and classify these data sets. To make a fair comparison, the same common parameter values are used, in addition to each method's own parameters. The four methods use the same population size, and the maximum number of generations is set to 100. For the PSO, the parameters are w = 0.75 and c1 = c2 = 1.5. For the GA, the mutation probability is pm = 0.1 and the crossover probability is pc = 0.7. In the IFOA, the step size is α = 0.5.

In Fig. 1, the fitness curves of the three data sets obtained with the four methods are plotted. From the fitness curves shown in Fig. 1, it can be clearly seen that FOA, GA and PSO either fall into a local optimum easily or have a rather slow evolution speed, while the IFOA exhibits good global search ability and fast convergence. The test results on the three data sets are shown in Table 2. It can be seen that the IFOA-SVM has higher recognition accuracy than the FOA-SVM, the GA-SVM and the PSO-SVM, which indicates that the IFOA is superior to FOA, GA and PSO for SVM parameter optimization.

Fig. 1. Fitness curves for the three datasets (panels: German, Segment, Glass) with the four methods (IFOA, FOA, PSO, GA); x-axis: iteration number, y-axis: fitness/%.

Table 2. Testing Results of the Three Datasets (%)
Datasets   IFOA-SVM   FOA-SVM   GA-SVM    PSO-SVM
German     7.88       7.65      71.875    7.5
Segment    8.687      8.5466    8.1739    8.444
Glass      67.4418    64.8598   6.155     64.341

4. Application in Fault Diagnosis of Hydraulic Pump

Hydraulic pumps are important and frequently used components of hydraulic systems, which are widely used in industrial applications. Any defect occurring during operation may lead to serious damage to the whole hydraulic system. Therefore, it is essential to develop a reliable condition monitoring and fault diagnosis method to prevent malfunctions. The use of vibration signals is quite common in the field of condition monitoring and damage detection for hydraulic pumps. Diagnostic information can be obtained from the vibration signals by using signal processing techniques; feature extraction and selection are then undertaken; finally, pattern recognition methods are used to diagnose the faults. In this section, the proposed method is applied to the fault diagnosis of a hydraulic pump.

Fig. 2. Time-domain waveforms of the four fault types (panels: N, F1, F2, F3); x-axis: t/s, y-axis: amplitude A/(m/s²).

4.1. Data Collection

In this paper, the test object is a swash-plate axial piston hydraulic pump of type SY-10MCY14-1EL, which includes seven pistons and whose rated speed is 1500 rpm. A piezoelectric accelerometer attached to the end cap was used to collect the vibration signals. Four different fault types were obtained: normal (N), one loose slipper (F1), two loose slippers (F2) and slipper wear (F3). The sampling frequency is 5 kHz. 100 samples of each fault type are acquired and each sample contains 2048 data points; therefore, 400 samples in total are collected for further analysis. Fig. 2 shows the time-domain waveforms of the four fault types. From Fig. 2 it may be possible to distinguish the different fault types, but it is unreliable to make a decision based only on the time-domain waveforms, so further analysis is needed.

4.2. Feature Extraction

The autoregressive (AR) model [17] is an effective approach for extracting fault features from vibration signals, but it can only be applied to stationary signals, whereas the fault vibration signals of a hydraulic pump are non-stationary. The local characteristic-scale decomposition (LCD) [16] method is a new signal analysis method which can decompose a non-stationary vibration signal into several intrinsic scale components (ISCs) which are stationary. In this paper, the input features of the SVM are obtained by a hybrid method based on LCD and the AR model. The four kinds of hydraulic pump fault signals are first decomposed into several ISCs; then AR models of the first three ISCs of each vibration signal, which contain the main fault information, are established, and the order of the AR model is determined as eight by the AIC criterion. The eight AR parameters and the variance of the AR model of each ISC are regarded as the feature vector. Therefore, the dimension of the feature vector is 27 and the size of the feature matrix is 400 × 27.
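Since LCD is not available in standard Python libraries, the sketch below assumes the decomposition has already produced the first three ISCs of one vibration sample, and only illustrates the AR(8)-plus-variance feature construction described above (the function name extract_ar_features and the use of the Yule-Walker estimator from statsmodels are assumptions for illustration, not the authors' code).

```python
# Hedged sketch of the Section 4.2 features: for each of the first three ISCs,
# fit an AR(8) model and keep its eight coefficients plus the model variance,
# giving a 3 * 9 = 27-dimensional feature vector per vibration sample.
import numpy as np
from statsmodels.regression.linear_model import yule_walker

def extract_ar_features(iscs, order=8):
    """iscs: iterable of the first three ISCs (1-D arrays) of one vibration sample."""
    features = []
    for isc in iscs:
        rho, sigma = yule_walker(isc, order=order)  # AR coefficients and noise std
        features.extend(rho)                        # eight AR parameters
        features.append(sigma ** 2)                 # variance of the AR model
    return np.asarray(features)                     # 27-dimensional feature vector

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_iscs = [rng.standard_normal(2048) for _ in range(3)]   # placeholder ISCs
    print(extract_ar_features(fake_iscs).shape)                 # (27,)
```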

4.3. Fault Diagnosis

In this study, to verify the performance of the IFOA-SVM in fault diagnosis, a fair comparison is made with the FOA-SVM, GA-SVM and PSO-SVM. All 400 data samples are divided into two data sets, a training set and a testing set: the training set, including 240 samples, is used to calculate the fitness and train the diagnosis model, and the testing set, including 160 samples, is used to examine the classification accuracy of each model.

Fig. 3. Fitness optimization search curves of the four methods (IFOA, FOA, PSO, GA) on the training set; x-axis: iteration number, y-axis: fitness/%.

Fig. 4. Visual expression of the classification results; x-axis: fault type (left to right: N, F1, F2, F3), y-axis: correct classification number, bars: IFOA-SVM, FOA-SVM, PSO-SVM, GA-SVM.

In the optimized SVM, the parameters C and g are optimized by IFOA, FOA, GA and PSO respectively, and the adjusted parameters with the maximal cross-validation accuracy are selected as the most appropriate parameters. Then, the optimal parameters are used to train the SVM model. The search span of the parameters C and g is [0, 100]. The four methods use the same population size, and the maximum number of generations is set to 100. The remaining parameters of the four methods are the same as in Section 3.4.

Table 3. Diagnostic Results of the Different Methods
Method      Values of C   Values of g   Classification Accuracy/%   Training Time/s
IFOA-SVM    198.4376      0.496         96.5                        4.319
FOA-SVM     43.9734       0.6596        9.5                         4.357
GA-SVM      187.6431      0.1419        9.65                        88.3441
PSO-SVM     1.4944        0.778         93.15                       17.874

The fitness curves of the four methods on the training set are shown in Fig. 3. From them we can see that the highest five-fold cross-validation accuracy is 98.75%, achieved by the IFOA-SVM within only 16 iterations. Faster convergence and higher classification accuracy are obtained compared with the other three methods. Table 3 gives the diagnostic results of the different methods and Fig. 4 shows their visual expression. As can be seen from Table 3 and Fig. 4, the values of C and g obtained by each method are different. The classification accuracy of the proposed IFOA-SVM is 96.5%, a 3.15% improvement over the PSO-SVM, which has the highest accuracy among the other three methods. Furthermore, although the computation time of the proposed method is a bit longer than that of the FOA-SVM, it is much shorter than that of the GA-SVM and the PSO-SVM. The above analysis indicates that the proposed method is well suited to the fault diagnosis of hydraulic pumps.

5. Conclusion

In this paper, a fault diagnosis method using SVM with an improved FOA was proposed. The improved FOA is essentially the standard FOA combined with a Levy flight search strategy, and it was used to optimize the SVM parameters. The IFOA-SVM was then applied to the fault diagnosis of a hydraulic pump. The experimental results of the hydraulic pump fault diagnosis indicate that the IFOA is feasible for optimizing the SVM parameters, and that the IFOA-SVM reduces the training time compared with the GA-SVM and PSO-SVM methods while improving the fault diagnosis accuracy compared with the FOA-SVM, GA-SVM and PSO-SVM methods.

Acknowledgment

This work is supported by the National Natural Science Foundation of China (No. 517554). The authors would like to thank the UCI repository for the datasets provided.

References

[1] Liu, S. C., & Liu, S. Y. (2003). An efficient expert system for machine fault diagnosis. International Journal of Advanced Manufacturing Technology, 21, 691-698.
[2] Tian, D. M., & Qu, W. L. (2011). Finite element model updating of structures based on lifting wavelet packet transform and fuzzy pattern recognition. Journal of Vibration and Shock, 30(8), 194-198.
[3] Liu, X. J., & Liu, X. M. (2011). Modified PSO-based artificial neural network for power electronic devices fault diagnosis modeling. Energy Procedia, 13, 748-750.
[4] Chao, J. H. (2013). Fault diagnosis for electrical control system by support vector machine and chaotic particle swarm optimization algorithm. Journal of Computational Information Systems, 9(1), 4931-4938.
[5] Rond, Y., Dan, P., Feng, H., & Min, H. (2014). Fault diagnosis of engine by support vector machine and improved particle swarm optimization algorithm. Journal of Information & Computational Science, 11(13), 4827-4835.
[6] Bai, J., Yang, L. H., & Zhang, X. Y. (2013). Parameter optimization and application of support vector machine based on parallel artificial fish swarm algorithm. Journal of Software, 8(3), 673-679.
[7] Huang, C. L., & Wang, C. J. (2006). A GA-based feature selection and parameters optimization for support vector machines. Expert Systems with Applications, 31, 231-240.

[8] Lin, S. W., Ying, K. C., Chen, S. C., & Lee, Z. J. (2008). Particle swarm optimization for parameter determination and feature selection of support vector machines. Expert Systems with Applications, 35, 1817-1824.
[9] Zhang, X. L., Chen, X. F., & He, Z. J. (2010). An ACO-based algorithm for parameter optimization of support vector machines. Expert Systems with Applications, 37, 6618-6628.
[10] Sulaiman, M. H., Mustafa, M. W., Shareef, H., & Khalid, S. N. A. (2012). An application of artificial bee colony algorithm with least squares support vector machine for real and reactive power tracing in deregulated power system. International Journal of Electrical Power & Energy Systems, 37, 67-77.
[11] Pan, W. T. (2012). A new fruit fly optimization algorithm: Taking the financial distress model as an example. Knowledge-Based Systems, 26, 69-74.
[12] Yahya, M., & Saka, M. P. (2014). Construction site layout planning using multi-objective artificial bee colony algorithm with Levy flights. Automation in Construction, 38, 14-29.
[13] Dogan, E. (2014). Solving design optimization problems via hunting search algorithm with Levy flights. Structural Engineering and Mechanics, 52, 351-368.
[14] Mantegna, R. N. (1994). Fast, accurate algorithm for numerical simulation of Levy stable stochastic processes. Physical Review E, 49, 4677-4683.
[15] Frank, A., & Asuncion, A. UCI machine learning repository. Retrieved from http://archive.ics.uci.edu/ml/datasets.html
[16] Cheng, J. S., Yang, Y., & Yang, Y. (2012). Local characteristic-scale decomposition method and its application to gear fault diagnosis. Journal of Mechanical Engineering, 48(9), 64-71.
[17] Cheng, J. S., Yu, D. J., & Yang, Y. (2006). A fault diagnosis approach for roller bearings based on EMD method and AR model. Mechanical Systems and Signal Processing, 20(2), 350-362.

Qiantu Zhang was born in Chongqing, P. R. China in 1991. He is a postgraduate student at the First Department, Mechanical Engineering College, P. R. China. He received his bachelor degree from Mechanical Engineering College in 2013. His research interests include intelligent fault diagnosis, signal processing and support vector machines.

Liqing Fang was born in Shijiazhuang, Hebei Province, P. R. China in 1969. He is a professor at the First Department, Mechanical Engineering College, P. R. China. He received his Ph.D. from Beijing Institute of Technology in 2005. His current research interests include intelligent fault diagnosis and test technology. He is the author or coauthor of more than 50 scientific papers.