A Fast Wrapper Feature Subset Selection Method Based On Binary Particle Swarm Optimization


2013 IEEE Congress on Evolutionary Computation, June 20-23, Cancún, México

A Fast Wrapper Feature Subset Selection Method Based On Binary Particle Swarm Optimization

Xing Liu, State Key Laboratory of Novel Software Technology, Department of Computer Science and Technology, Nanjing University, Nanjing, China, liu.xing@outlook.com

Lin Shang, State Key Laboratory of Novel Software Technology, Department of Computer Science and Technology, Nanjing University, Nanjing, China, shanglin@nju.edu.cn

Abstract: Although many particle swarm optimization (PSO) based feature subset selection methods have been proposed, most of them seem to ignore the differences between feature subset selection problems and other optimization problems. We analyze the search process of a PSO based wrapper feature subset selection algorithm and find that characteristics of feature subset selection can be used to optimize this process. We compare the wrapper and filter ways of evaluating features, define the domain knowledge of feature subset selection problems, and propose a fast wrapper feature subset selection algorithm based on PSO that employs this domain knowledge. Experimental results show that our method works well: the new algorithm improves both the running time and the classification accuracy.

I. INTRODUCTION

In classification problems, a learning algorithm is typically given a dataset containing many training instances, where each instance has a class label indicating which group it belongs to and a vector of feature values. The goal of the learning algorithm is to build a classifier which can be used to classify unseen instances. Nowadays, datasets usually have a large number of instances, and those instances also have a large number of features, which has brought great challenges to learning algorithms. The challenge can be seen in two ways: first, a large dataset requires longer computing time and more computing resources, which may not be affordable in real-world applications; second, a large number of features usually means more redundant features as well as noise features, both of which have negative effects on a classifier's classification accuracy [1]. Feature subset selection is meant to deal with this problem. The objective of feature subset selection is to select a minimal subset of features upon which a learning algorithm can build a classifier and achieve satisfying classification accuracy. Researchers have done much work in this area and many methods have been proposed [2]. Usually, feature subset selection methods can be divided into two main categories, filters and wrappers. Both filter methods and wrapper methods can be seen as search methods, which means that filters and wrappers search among all possible subsets of the whole feature set and choose one as the result according to some criterion; their difference lies in the evaluation criteria they use [3]. Wrappers use a learning algorithm to conduct the search, and the classification accuracy of the produced classifier is used as the selection criterion. Filters are independent of any learning algorithm; instead, they focus on the intrinsic properties of the data and use measurements like information gain, distance, and dependence as the selection criterion [2]. Currently, both filters and wrappers have the problem of falling into local optima: exhaustive search is not affordable in most cases, so they have to choose a heuristic search strategy.
However, a heuristic search strategy tends to fall into local optima easily. In order to get better performance, feature subset selection methods need more powerful search engines. Particle swarm optimization (PSO) is a population based optimization technique proposed by Kennedy and Eberhart in 1995 [4]. PSO simulates the social behavior of bird flocking or fish schooling. Each particle represents a candidate solution, and in each iteration, particles update their positions in the search space according to their neighbors to find the optimal solution. PSO is fast, cheap, and can usually obtain good results, which has made it widely applied in many areas, including feature subset selection. Many PSO based feature subset selection methods have been proposed [5] and they have achieved good results. However, most of the proposed methods use PSO as a search engine directly, ignoring the special characteristics of feature subset selection problems. With the goal of maximizing classification accuracy, we choose to build a PSO based feature subset selection algorithm in a wrapper way. In each iteration, each particle needs to run a learning algorithm, which takes a long time. Existing PSO based wrapper methods all have this problem, yet little attention has been paid to it. By analyzing the search process of a PSO based wrapper feature subset selection algorithm, we find that the search process can be optimized by using domain knowledge of feature subset selection problems, and based on this we propose a fast wrapper feature subset selection algorithm based on PSO. Experimental results show that it improves both the running time and the classification accuracy.

The remainder of this paper is organized as follows. In Section II, we review feature subset selection methods, PSO, and existing feature subset selection methods based on PSO. In Section III, we propose a binary PSO (BPSO) based wrapper feature subset selection algorithm, BPSOWFSS, following a two-step procedure commonly used by existing methods. In Section IV, we analyze the search process of BPSOWFSS and propose a new algorithm named Fast BPSOWFSS, which speeds up the search process by bringing in domain knowledge of feature subset selection problems. Section V covers the experiment design, results, and analyses of the results. Finally, we conclude our work and discuss possible future work in Section VI.

II. BACKGROUND

A. Feature subset selection

Feature subset selection is a long-standing technique for dealing with the problems brought by too many features [1]. A feature subset selection method is usually made up of two parts: a feature subset generator and an evaluator. The feature subset generator searches the state space and generates feature subsets; the evaluator evaluates the generated subsets and determines how good they are. The two parts work together to find the feature subset which best meets the evaluation criteria. The state space is made up of all subsets of the whole feature set, as illustrated in Fig. 1 [1].

Fig. 1. The state space is made up of all subsets of the whole feature set.

The feature subset generator can also be seen as a search engine, and search engines can be divided into three categories: exhaustive, heuristic, and nondeterministic [1]. These search engines explore the state space using different search strategies. Evaluators can be divided into two categories according to whether a learning algorithm is involved [3]. Wrapper methods use a specific learning algorithm's classification accuracy to evaluate how good a subset is, while filter methods use other criteria. Commonly used evaluation criteria in filters include the consistency measure and the correlation measure [2]. The typical goal of a classification algorithm is to maximize its classification accuracy, and wrapper methods can usually achieve higher classification accuracy than filter methods because filter methods cannot deal with the bias of a learning algorithm [3]. However, the cost is that wrapper methods are usually slower than filter methods. Many classical feature subset selection methods have been proposed, and most of them can be seen as a combination of a feature subset generator and an evaluator. FOCUS [6] can be seen as an exhaustive search engine plus a filter evaluator which uses the consistency measure to evaluate a feature subset. Relief [7], another classical algorithm proposed by Kira and Rendell, is a little different because it does not have a feature subset generator. The underlying idea of Relief is that relevant features are those whose values can distinguish among instances that are near each other. Yu and Liu [8] proposed a fast correlation-based filter solution, which defines the concept of predominant correlation to do feature subset selection. More feature subset selection methods can be found in [2].

B. Particle swarm optimization

Particle swarm optimization (PSO) is a population based stochastic optimization technique developed by Kennedy and Eberhart in 1995 [4]. Since then, many researchers have paid much attention to PSO and many variations have been proposed. However, most of these variations share the same scheme as the original: a particle represents a candidate solution, which can be seen as a position in the search space, and in each iteration the particle updates its position according to its own knowledge and the swarm's knowledge to search for the optimal solution. The process can be described formally as follows. Each particle P_i has a vector X_i = (x_1, x_2, ..., x_n) which represents its position in an n-dimensional search space and a vector V_i = (v_1, v_2, ..., v_n) representing the particle's velocity.
Besides, every particle knows the best solution found by itself so far, which is called pbest, and the best solution found so far by the swarm, which is known as gbest. In each iteration, X_i and V_i are updated as follows:

$$V_i^{t+1} = w \cdot V_i^t + c_1 \cdot rand() \cdot (pbest - X_i^t) + c_2 \cdot rand() \cdot (gbest - X_i^t) \quad (1)$$

$$X_i^{t+1} = X_i^t + V_i^{t+1} \quad (2)$$

where w is the so-called inertia weight, c_1 and c_2 are two acceleration coefficients, rand() generates random numbers, V_i^t represents the value of V_i in the t-th iteration, and X_i^t represents the value of X_i in the t-th iteration.

PSO was proposed to solve continuous optimization problems; in 1997, Kennedy developed binary PSO (BPSO), which can be used in feature subset selection problems [9]. In BPSO, a particle's position X is described as a binary string vector. The main idea of BPSO is still the same as the original PSO, and the update equation of V_i remains the same. As the representation of the position is changed, the update equation of X_i is newly defined as follows:

$$x_{id} = \begin{cases} 1, & rand() < S(v_{id}) \\ 0, & \text{otherwise} \end{cases} \quad (3)$$

$$S(v_{id}) = \frac{1}{1 + e^{-v_{id}}} \quad (4)$$

If we let x_{id} = 1 mean that the d-th feature is selected, and vice versa, then PSO can be used in feature selection problems. Many researchers have worked on this and many methods have been proposed.

Since PSO was proposed, many researchers have tried to improve its performance in many ways. Shi and Eberhart [10] discussed parameter selection problems of PSO; they mainly analyzed the impact of the inertia weight and the maximum velocity, and gave guidelines for selecting the two parameters. Clerc and Kennedy discussed how PSO works by analyzing a particle's trajectory, and their work suggests ways to alter the standard PSO [11]. Many variations of the standard PSO have been proposed; J. Kennedy replaced the velocity of particles with random numbers sampled from a Gaussian distribution whose mean equals the mean of the personal best and the global best, and the new PSO is called bare bones PSO [12]. More details about PSO and its variations can be found in the overview of PSO by Banks et al. [13], [14].
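To make equations (1), (3), and (4) concrete, here is a minimal Python/NumPy sketch of one BPSO iteration step. It is an illustration, not the authors' implementation; the parameter defaults, the velocity clamp, and all names are our assumptions. Note that in BPSO the binary rule (3) replaces the continuous position update (2).

import numpy as np

rng = np.random.default_rng(0)

def bpso_step(X, V, pbest, gbest, w=0.7, c1=1.5, c2=1.5, v_max=4.0):
    # X, V, pbest: (swarm_size, n_features) arrays; gbest: (n_features,).
    # Velocity update, equation (1).
    V = (w * V
         + c1 * rng.random(X.shape) * (pbest - X)
         + c2 * rng.random(X.shape) * (gbest - X))
    V = np.clip(V, -v_max, v_max)              # common practice: bound the velocity
    S = 1.0 / (1.0 + np.exp(-V))               # sigmoid transfer function, equation (4)
    X = (rng.random(X.shape) < S).astype(int)  # stochastic bit assignment, equation (3)
    return X, V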

C. The application of PSO to feature subset selection problems

Because of PSO's powerful search ability, many researchers have used BPSO as a feature subset generator and achieved good performance. Wang et al. [15] proposed a feature subset selection method based on BPSO and rough sets. They designed a fitness function based on rough sets, and the results show that the proposed algorithm can be successfully used for feature subset selection problems. Besides rough sets, fuzzy sets have also been used by Chakraborty [16] to design a fitness function for BPSO to optimize. In his work, Chakraborty compared the performance of BPSO and the genetic algorithm (GA) with the same fitness function, and the result showed that BPSO outperformed GA. Cervante et al. [17] used BPSO to solve feature subset selection problems in a filter way and developed two information-theory based fitness functions. Li et al. [18] proposed a similar method based on overlap information entropy. Besides standard BPSO, multi-objective PSO has also been applied to feature subset selection problems [19] and has achieved good results. The similarity among the methods listed above is that they are all filter methods. There are also wrapper feature subset selection methods based on BPSO. Azevedo et al. [20] used a support vector machine (SVM) as the learning algorithm and developed a wrapper feature subset selection method based on BPSO; their results show that the BPSO based method achieves higher accuracy and uses less processing time than a GA based method. Wang et al. [21] also used SVM as the learning algorithm, and they proposed a one-dimension real-valued PSO based on BPSO to improve performance and achieved satisfying results. Chuang et al. [22] take the K-nearest neighbor method as the evaluator and propose an improved binary particle swarm optimization as the feature subset generator; their experiments show that their method can achieve high classification accuracy in most tests.

III. OUR PROPOSED ALGORITHM

Almost all existing PSO based feature subset selectors are built following a two-step procedure. The first step is to design a fitness function so that feature subset selection problems can be solved by PSO, and the second step is to select a variation of BPSO. The main characteristic of existing PSO based feature subset selection methods is that the two steps are done individually; there is no interaction between them. In this section, we propose a BPSO based wrapper feature subset selection algorithm, BPSOWFSS, following this two-step procedure, with the two steps conducted individually.

A. Fitness Function

Since our main goal is to maximize classification accuracy, it is better to design the fitness function in a wrapper way [3]. Another goal is to have as few features as possible, so we should strike a balance between the two goals. We define the fitness function as follows:

$$Fit(s) = a \cdot Er(s) + (1 - a) \cdot \frac{\#s}{\#S} \quad (5)$$

where s is a subset of the whole feature set S, Er(s) is the classification error rate of feature subset s, #s is the number of features in s, and #S is the total number of features in S. Here a is a real-valued number ranging from 0 to 1; in our experiments, we set a to 0.99, which means that classification accuracy is most important and the number of features does not matter too much. The goal then becomes finding the subset that minimizes this fitness function.
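As a sketch of equation (5) in code, the wrapper error rate Er(s) can be estimated by cross-validating a classifier on the selected columns. The decision tree below (scikit-learn's CART) is only a stand-in for whatever learner the wrapper uses, and all names here are ours, not the paper's.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

def fitness(subset_mask, X, y, a=0.99):
    # Equation (5): Fit(s) = a * Er(s) + (1 - a) * #s / #S.
    selected = np.flatnonzero(subset_mask)
    if selected.size == 0:
        return 1.0  # an empty subset gets the worst possible fitness
    # Er(s): cross-validated error rate of the learner trained on subset s.
    acc = cross_val_score(DecisionTreeClassifier(random_state=0),
                          X[:, selected], y, cv=10).mean()
    return a * (1.0 - acc) + (1.0 - a) * selected.size / X.shape[1]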
B. Apply BPSO To Feature Subset Selection Problems

Consider a feature subset selection problem with n features. A feature subset can then be represented by an n-bit binary string vector X_i = (x_1, x_2, ..., x_n) consisting of 0s and 1s. If x_id is 0, the d-th feature is not selected in this subset; if x_id is 1, the d-th feature is selected. Each binary string vector X_i represents the position of a particle in BPSO. Based on the fitness function designed above, we propose a BPSO based feature subset selection algorithm named BPSOWFSS, which is very similar to existing methods. The pseudocode is given below.

Algorithm BPSOWFSS
  Initialize parameters of BPSO
  Randomly initialize swarm
  WHILE stopping criterion not met DO
    Calculate each particle's fitness according to equation (5)
    FOR i = 1 to swarmsize DO
      Update the lbest of P_i
      Update the pbest of P_i
    END
    FOR i = 1 to swarmsize DO
      FOR j = 1 to dimension DO
        Update the velocity of P_i according to equation (1)
        Update the position of P_i according to equation (3)
      END
    END
  END
  Return the best feature subset found by the swarm

As stated before, in each iteration of BPSOWFSS, the fitness function we designed has to be run for every particle in the swarm, and this is the main time-consuming part. In the next section we analyze this process and propose a new wrapper feature subset selection algorithm with the aim of decreasing running time without sacrificing classification accuracy. The idea we use is to bring in domain knowledge of feature subset selection problems.
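Combining the update step and the fitness function sketched earlier gives a compact runnable version of the loop above; it reuses np, rng, bpso_step, and fitness from the previous sketches. For brevity this sketch uses a global-best (gbest) topology instead of the small changing neighborhoods (lbest) in the pseudocode, and the fixed iteration budget is an assumption.

def bpsowfss(X_data, y, swarm_size=20, n_iters=50):
    n = X_data.shape[1]
    X = rng.integers(0, 2, size=(swarm_size, n))      # random initial subsets
    V = rng.uniform(-1.0, 1.0, size=(swarm_size, n))  # random initial velocities
    fit = np.array([fitness(p, X_data, y) for p in X])
    pbest, pfit = X.copy(), fit.copy()                # personal bests
    g = pfit.argmin()
    gbest, gfit = pbest[g].copy(), pfit[g]            # swarm best
    for _ in range(n_iters):                          # stopping criterion: fixed budget
        X, V = bpso_step(X, V, pbest, gbest)
        fit = np.array([fitness(p, X_data, y) for p in X])
        improved = fit < pfit                         # update personal bests
        pbest[improved] = X[improved]
        pfit[improved] = fit[improved]
        g = pfit.argmin()
        if pfit[g] < gfit:                            # update swarm best
            gbest, gfit = pbest[g].copy(), pfit[g]
    return gbest, gfit                                # best subset and its fitness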

IV. A FAST DOMAIN KNOWLEDGE BASED APPROACH

In Section III, we proposed a BPSO based wrapper feature subset selection algorithm, BPSOWFSS, which can be seen as a representative of the other BPSO based wrapper feature subset selection algorithms mentioned in Section II. In this section, we analyze the search process of BPSOWFSS and then define the domain knowledge of feature subset selection problems to optimize that process; based on this, we propose a new algorithm called Fast BPSOWFSS.

A. The Search Process of BPSOWFSS

The search process of BPSOWFSS is made up of two parts: the update of the particles' velocities and the update of the particles' positions. The update of a particle's velocity is in fact feedback from the fitness function, indicating the search direction in the next iteration. The update of a particle's position is the process of updating the feature subset represented by the particle, and this process is mainly determined by the update of the particle's velocity, which means the velocity update is a key factor in the search process of BPSOWFSS. The update of a particle's velocity follows equation (1), which has three terms: the previous velocity, the cognitive component, and the social component. We consider the latter two components as a whole because they both represent the knowledge learned by a particle. As we can see from equation (1), the knowledge a particle can learn is the distance between its current position and its personal best position, as well as the best position found by its neighborhood. Equation (1) is clearly independent of the problem being optimized, which means it can be applied to any optimization problem without change. This is indeed an advantage of PSO, and it is what makes PSO widely used in many optimization problems. However, for a specific optimization problem, the best optimization algorithm is always problem-dependent according to the no-free-lunch theorem [23], which implies that if we optimize the search process to better fit feature subset selection problems, better performance can be achieved. Our idea is to develop a problem-dependent algorithm by using domain knowledge of feature subset selection problems to optimize the standard search process of PSO.

B. Domain Knowledge of Feature Subset Selection

Wrappers use the learning algorithm as a black box which returns classification accuracy; however, wrappers cannot tell why one feature subset achieves higher classification accuracy than another. In contrast, filters use techniques like information theory [2] to evaluate a feature subset, which tells us which kind of feature subset may achieve higher classification accuracy; however, filters cannot tell us the classification accuracy directly. Imagine that we find some mushrooms in a forest and want to know whether they can be eaten. We have no record of them, so we would have to try them ourselves (which is dangerous). The good thing is that there is some common-sense knowledge about how to figure out whether a mushroom is poisonous; for example, poisonous mushrooms are usually colorful. This kind of knowledge is domain knowledge, and it can help a lot even if it cannot guarantee safety in all conditions. If we compare this scene with feature subset selection problems, it is obvious that trying mushrooms directly to find out whether they are poisonous is the wrapper way and the other one is the filter way, which means that what filters provide represents the domain knowledge of feature subset selection problems.
C. Use Domain Knowledge to Optimize the Search Process

Our fitness function is designed in a wrapper way, which directly tells the classification accuracy of a candidate feature subset, and BPSO modifies its search direction based on the fitness function. However, we think that using classification accuracy alone to guide the search process is not enough. Domain knowledge of feature subset selection problems can help by telling BPSO which features are more likely to be selected in the final feature subset and which features are less likely to be selected. This decreases the effective size of the search space because PSO will pay more attention to good features and less attention to bad features, which means some feature subsets become unlikely to be searched; as a result, the search process is sped up. Besides, it is still the fitness function, which mainly focuses on classification accuracy, that leads the direction of the particles, so the overall classification accuracy obtained at the end will not be sacrificed. In equation (3), a feature with a higher velocity is more likely to be selected, so our idea is to use domain knowledge to evaluate a feature and then increase or decrease its velocity depending on its assessed value. This can be expressed as follows:

$$x_{id} = \begin{cases} 1, & rand() < S(v_{id} + Score(x_{id})) \\ 0, & \text{otherwise} \end{cases} \quad (6)$$

where Score(x_{id}) is the assessed value of the feature represented by x_{id}. Score() can be designed in many ways. In the following, we propose Fast BPSOWFSS, whose Score() uses the concepts of relevance and redundancy to optimize the search process of BPSOWFSS.

D. Fast BPSOWFSS

We design Score() with the idea that a feature with high relevance to the class and low redundancy with the other features in a feature subset should have a higher probability of being selected into the feature subset than features with low relevance, high redundancy, or both. Relevance and redundancy can be measured in terms of mutual information [24], [25]. We design Score() based on mutual information as follows:

$$Score(F_i, S) = Relevance(F_i, C) - Redundancy(F_i, S) \quad (7)$$

$$Relevance(F_i, C) = IG(F_i, C) \quad (8)$$

$$Redundancy(F_i, S) = \frac{1}{\#S} \sum_{F_j \in S} IG(F_i, F_j) \quad (9)$$

where C is the class feature, F_i is the feature currently being evaluated, S is a feature subset, #S is the number of features in the feature subset S, and IG(F_i, C) is the information gain between feature F_i and the class. From the equations shown above, it is easy to see that Score(F_i, S) is the relevance of feature F_i minus its average redundancy with the features in the feature subset S; the value of Score(F_i, S) represents the relevance and redundancy that F_i would bring to feature subset S if it were selected. The higher Score(F_i, S) is, the better we think feature F_i is, and vice versa. By adding Score() to the velocity of a particle, features with a higher Score() get a higher probability of being selected, while features with a lower Score() become less likely to be selected.
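A sketch of equations (7)-(9) in Python: for discrete (nominal) features, the information gain IG coincides with mutual information, which scikit-learn's mutual_info_score computes from two integer-encoded columns. The names and structure below are ours, and the precomputation mirrors the implementation note in the next subsection.

import numpy as np
from sklearn.metrics import mutual_info_score

def precompute_ig(X_data, y):
    # IG(F_i, C) and IG(F_i, F_j) depend only on the data, so they are
    # computed once, before the PSO iterations (see Section IV-E).
    n = X_data.shape[1]
    relevance = np.array([mutual_info_score(X_data[:, i], y) for i in range(n)])
    redundancy = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            ig = mutual_info_score(X_data[:, i], X_data[:, j])
            redundancy[i, j] = redundancy[j, i] = ig
    return relevance, redundancy

def score(i, subset_mask, relevance, redundancy):
    # Equation (7): Score(F_i, S) = Relevance(F_i, C) - Redundancy(F_i, S),
    # with Relevance as in (8) and the average redundancy as in (9).
    selected = np.flatnonzero(subset_mask)
    avg_red = redundancy[i, selected].mean() if selected.size else 0.0
    return relevance[i] - avg_red

Equation (6) then simply adds score(i, ...) to v_id before the sigmoid in the position update of equation (3).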

E. Implementation Issues

Our goal is to speed up the search process without sacrificing final classification accuracy; however, the function Score() we designed to evaluate a single feature adds computation time to our algorithm. We solve this problem in our Fast BPSOWFSS implementation by calculating each feature's relevance to the class and its redundancy with the other features before the iteration process of BPSO, because the calculation of IG(F_i, C) and IG(F_i, F_j) is independent of the learning algorithm and remains unchanged during the iterations. Before the iteration process of PSO, we calculate IG(F_i, C) for every feature F_i and IG(F_i, F_j) for every pair of features F_i and F_j. During the iteration process, the calculation of Score() then becomes an easy task because all the values it needs have already been computed; the time required to calculate Score() is negligible compared to the total running time of an iteration. By doing so, we are able to compare the search speed of BPSOWFSS and Fast BPSOWFSS by comparing their iteration counts.

V. EXPERIMENTS AND RESULTS

In this section, we design a set of experiments to show that our idea works well. We choose ten commonly used datasets and run both algorithms on them, recording the running time on each dataset as well as the classification accuracy. By comparing the results, we want to show that Fast BPSOWFSS can improve both the running time and the classification accuracy.

A. Datasets

We choose ten UCI datasets [26], all of which are nominal datasets. Some datasets have already been divided into a training set and a test set, while others have not. If a dataset has not been split, ten-fold cross-validation (CV) is used to perform classification. Detailed information about these datasets is given in TABLE I.

TABLE I. DATASETS (columns: Dataset, Feature Number, Instance Number, Base Accuracy; datasets: Breast-cancer, Crx, Monk1, Monk2, Monk3, Corral, Nursery, Soybean, Spect, Tic-tac-toe; the numeric entries are not legible in this copy)

Feature Number is the total number of features in a dataset, excluding the class feature. Instance Number is the total number of instances in a dataset; if a dataset has both a training set and a test set, this column contains two numbers separated by "/", where the former is the number of instances in the training set and the latter is that of the test set. Base Accuracy is the classification accuracy obtained by the learning algorithm used in the experiment with all features included.

B. Experiment Design

As our fitness function is designed in a wrapper way, a learning algorithm is needed. Here we choose the decision tree algorithm C4.5, which is provided as J48 in Weka [27]. Parameter settings are the same for both algorithms: the inertia weight w and the acceleration coefficients c_1 and c_2 are set following [28], the swarm size is set to 20 [28], and every particle has at most 3 neighbors (including itself), which change in every iteration. Each algorithm is run 100 times on each dataset and the results are averaged. In this experiment, we mainly focus on two measures, classification accuracy and running time; classification accuracy is measured by the classification error rate.
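The averaged-error protocol can be sketched as follows; it reuses the bpsowfss sketch above, the run count is reduced, and the error rate is recovered from the fitness value via equation (5). This is our illustration, not the experiment code.

def evaluate(X_data, y, n_runs=10, a=0.99):
    # Repeat the search and report Ave-Err and Std-Err, mirroring the
    # 100-run averaging used in the experiments (fewer runs here for speed).
    errors = []
    for _ in range(n_runs):
        subset, fit_value = bpsowfss(X_data, y)
        # Invert equation (5) to recover Er(s) from Fit(s).
        er = (fit_value - (1 - a) * subset.sum() / X_data.shape[1]) / a
        errors.append(er)
    return float(np.mean(errors)), float(np.std(errors))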
TABLE II. ITERATION TIMES (columns: Dataset, BPSOWFSS, Fast BPSOWFSS, Improvement; one row per dataset as in TABLE I; the numeric entries are not legible in this copy)

TABLE III. CLASSIFICATION ERROR RATE (columns: Dataset, then Ave-Err and Std-Err for each of BPSOWFSS and Fast BPSOWFSS; one row per dataset as in TABLE I; the numeric entries are not legible in this copy)

C. Results and analyses

From TABLE II and TABLE III we can see that Fast BPSOWFSS outperforms BPSOWFSS in almost all cases, in both running time and classification accuracy. In TABLE III, Ave-Err is the average classification error rate and Std-Err is the standard deviation of the classification error rate. The iteration counts used by the two algorithms are shown in TABLE II and Fig. 2; the iteration count given is the number of iterations an algorithm needs to find the optimal feature subset. From the results we can see that Fast BPSOWFSS never needs more iterations than BPSOWFSS to find the optimal feature subset, and in 7 out of 10 cases the advantage of Fast BPSOWFSS is obvious. Over all 10 datasets, Fast BPSOWFSS uses on average 18.2% fewer iterations than BPSOWFSS. On the datasets Monk1, Monk2, and Monk3, the iteration counts of the two algorithms are equal; however, as we will see below, Fast BPSOWFSS achieved a lower classification error rate on these three datasets.

Fig. 2. The iteration times of the two algorithms.

Fig. 3. Classification error rate of the two algorithms and the base error rate.

TABLE III and Fig. 3 show the final classification error rates obtained by the two algorithms. First of all, both BPSOWFSS and Fast BPSOWFSS decrease the classification error greatly. Our goal is to speed up the search of PSO without sacrificing classification accuracy, and the results show that this goal is achieved. On 9 out of 10 datasets, Fast BPSOWFSS achieved higher classification accuracy than BPSOWFSS. On the Tic-tac-toe dataset, BPSOWFSS achieved higher classification accuracy, but the results on this dataset are so close that we consider them equally good. In fact, although Fast BPSOWFSS obtained a lower classification error rate on 9 out of 10 datasets, the gap is not obvious in most cases, as can be seen from Fig. 3. Corral, Monk1, Monk2, Monk3, and Soybean are the 5 datasets on which the improvement can be seen clearly. Over all 10 datasets, Fast BPSOWFSS achieved on average a 4.6% lower classification error rate than BPSOWFSS. Besides, the standard deviation values are small on all 10 datasets, which means the two algorithms are stable, and Fast BPSOWFSS is the more stable of the two because it has smaller standard deviation values. From the results we obtained, our goal of speeding up the search process of PSO without sacrificing classification accuracy has been met. In fact, Fast BPSOWFSS even outperformed BPSOWFSS in classification accuracy, although the advantage is not obvious.

VI. CONCLUSION AND FUTURE WORK

In this paper, we investigated the application of PSO to feature subset selection problems. By analyzing the way most proposed PSO based feature subset selection methods apply PSO to feature subset selection problems, we find that existing methods tend to ignore the characteristics of feature subset selection problems. We propose a new method in which the search process of PSO is optimized using domain knowledge of feature subset selection problems. The main idea is to evaluate a feature subset in a wrapper way and design PSO's fitness function based on that, while the feature subset updating process is improved by a filter measure, so that a strongly relevant feature has more chance of being selected and a redundant or irrelevant feature has less chance of being selected. Besides, we solved the implementation issue brought by the additional function to be calculated. To show that our method works, we designed experiments whose results satisfied our expectations. Although the results are satisfying, we think there is still room for improvement. The idea of using domain knowledge of feature subset selection problems to help PSO's search process is worth much more future work. In our current work, we use the idea of maximizing relevance and minimizing redundancy; in fact, there are many other ways to evaluate a single feature, all of which could be used to enhance PSO's search ability in a feature subset selection problem, and it is worth researching their performance. Another question is how much weight the domain knowledge should be given in our method. Currently, we simply add it to the velocity of a particle; there should be a balance point, which is also worth researching.
Besides, we think the advantage of Fast BPSOWFSS will be more significant on bigger datasets with a large number of features, so experiments on bigger datasets should be carried out in future work.

ACKNOWLEDGMENT

We would like to acknowledge the support from the National Science Foundation of China and the Key Program of the Natural Science Foundation of Jiangsu Province, China.

REFERENCES

[1] H. Liu and H. Motoda, Feature Selection for Knowledge Discovery and Data Mining. Springer, 1998.
[2] I. Guyon and A. Elisseeff, "An introduction to variable and feature selection," Journal of Machine Learning Research, vol. 3, pp. 1157-1182, 2003.
[3] R. Kohavi and G. H. John, "Wrappers for feature subset selection," Artificial Intelligence, vol. 97, no. 1-2, pp. 273-324, 1997.
[4] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proc. IEEE International Conference on Neural Networks, vol. 4, 1995, pp. 1942-1948.
[5] R. Kohavi and G. H. John, "A survey on particle swarm optimization in feature selection," in Global Trends in Information Systems and Software Applications. Springer, 2012.
[6] H. Almuallim and T. G. Dietterich, "Learning with many irrelevant features," in Proceedings of the Ninth National Conference on Artificial Intelligence (AAAI-91), vol. 2, 1991, pp. 547-552.
[7] K. Kira and L. A. Rendell, "The feature selection problem: Traditional methods and a new algorithm," in Proceedings of the Tenth National Conference on Artificial Intelligence (AAAI-92), 1992, pp. 129-134.
[8] L. Yu and H. Liu, "Feature selection for high-dimensional data: A fast correlation-based filter solution," in Proceedings of the Twentieth International Conference on Machine Learning (ICML-03), 2003, pp. 856-863.
[9] J. Kennedy and R. Eberhart, "A discrete binary version of the particle swarm algorithm," in Proc. IEEE International Conference on Systems, Man, and Cybernetics, Computational Cybernetics and Simulation, vol. 5, 1997, pp. 4104-4108.

[10] Y. Shi and R. Eberhart, "Parameter selection in particle swarm optimization," in Evolutionary Programming VII, 1998, pp. 591-600.
[11] M. Clerc and J. Kennedy, "The particle swarm - explosion, stability, and convergence in a multidimensional complex space," IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 58-73, 2002.
[12] J. Kennedy, "Bare bones particle swarms," in Proc. IEEE Swarm Intelligence Symposium (SIS'03), 2003, pp. 80-87.
[13] A. Banks, J. Vincent, and C. Anyakoha, "A review of particle swarm optimization. Part I: background and development," Natural Computing, vol. 6, no. 4, pp. 467-484, 2007.
[14] A. Banks, J. Vincent, and C. Anyakoha, "A review of particle swarm optimization. Part II: hybridisation, combinatorial, multicriteria and constrained optimization, and indicative applications," Natural Computing, vol. 7, no. 1, pp. 109-124, 2008.
[15] X. Wang, J. Yang, X. Teng, W. Xia, and R. Jensen, "Feature selection based on rough sets and particle swarm optimization," Pattern Recognition Letters, vol. 28, no. 4, pp. 459-471, 2007.
[16] B. Chakraborty, "Feature subset selection by particle swarm optimization with fuzzy fitness function," in Proc. 3rd International Conference on Intelligent System and Knowledge Engineering (ISKE'08), vol. 1, 2008.
[17] L. Cervante, B. Xue, M. Zhang, and L. Shang, "Binary particle swarm optimisation for feature selection: A filter based approach," in Proc. IEEE Congress on Evolutionary Computation (CEC'12), 2012.
[18] A. Li and B. Wang, "Feature subset selection based on binary particle swarm optimization and overlap information entropy," in Proc. International Conference on Computational Intelligence and Software Engineering (CiSE'09), 2009.
[19] B. Xue, M. Zhang, and W. N. Browne, "Multi-objective particle swarm optimisation (PSO) for feature selection," in Proceedings of the Fourteenth International Conference on Genetic and Evolutionary Computation (GECCO'12), 2012.
[20] G. L. F. B. G. Azevedo, G. D. C. Cavalcanti, and E. C. B. C. Filho, "An approach to feature selection for keystroke dynamics systems based on PSO and feature weighting," in Proc. IEEE Congress on Evolutionary Computation (CEC'07), 2007.
[21] J. Wang, Y. Zhao, and P. Liu, "Effective feature selection with particle swarm optimization based one-dimension searching," in Proc. 3rd International Symposium on Systems and Control in Aeronautics and Astronautics (ISSCAA), 2010.
[22] L. Chuang, H. Chang, C. Tu, and C. Yang, "Improved binary PSO for feature selection using gene expression data," Computational Biology and Chemistry, vol. 32, no. 1, pp. 29-38, 2008.
[23] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67-82, 1997.
[24] H. C. Peng, F. H. Long, and C. Ding, "Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 8, pp. 1226-1238, 2005.
[25] L. Yu and H. Liu, "Efficient feature selection via analysis of relevance and redundancy," Journal of Machine Learning Research, vol. 5, pp. 1205-1224, 2004.
[26] A. Frank and A. Asuncion, "UCI machine learning repository," 2010.
[27] M. Hall, E. Frank, G. Holmes, B. Pfahringer, P. Reutemann, and I. H. Witten, "The WEKA data mining software: An update," SIGKDD Explorations, vol. 11, no. 1, pp. 10-18, 2009.
[28] M. Clerc, Particle Swarm Optimization. Wiley-ISTE, 2006.


More information

CHAPTER 6 ORTHOGONAL PARTICLE SWARM OPTIMIZATION

CHAPTER 6 ORTHOGONAL PARTICLE SWARM OPTIMIZATION 131 CHAPTER 6 ORTHOGONAL PARTICLE SWARM OPTIMIZATION 6.1 INTRODUCTION The Orthogonal arrays are helpful in guiding the heuristic algorithms to obtain a good solution when applied to NP-hard problems. This

More information

Solving the Hard Knapsack Problems with a Binary Particle Swarm Approach

Solving the Hard Knapsack Problems with a Binary Particle Swarm Approach Solving the Hard Knapsack Problems with a Binary Particle Swarm Approach Bin Ye 1, Jun Sun 1, and Wen-Bo Xu 1 School of Information Technology, Southern Yangtze University, No.1800, Lihu Dadao, Wuxi, Jiangsu

More information

International Journal of Digital Application & Contemporary research Website: (Volume 1, Issue 7, February 2013)

International Journal of Digital Application & Contemporary research Website:   (Volume 1, Issue 7, February 2013) Performance Analysis of GA and PSO over Economic Load Dispatch Problem Sakshi Rajpoot sakshirajpoot1988@gmail.com Dr. Sandeep Bhongade sandeepbhongade@rediffmail.com Abstract Economic Load dispatch problem

More information

SIMULTANEOUS COMPUTATION OF MODEL ORDER AND PARAMETER ESTIMATION FOR ARX MODEL BASED ON MULTI- SWARM PARTICLE SWARM OPTIMIZATION

SIMULTANEOUS COMPUTATION OF MODEL ORDER AND PARAMETER ESTIMATION FOR ARX MODEL BASED ON MULTI- SWARM PARTICLE SWARM OPTIMIZATION SIMULTANEOUS COMPUTATION OF MODEL ORDER AND PARAMETER ESTIMATION FOR ARX MODEL BASED ON MULTI- SWARM PARTICLE SWARM OPTIMIZATION Kamil Zakwan Mohd Azmi, Zuwairie Ibrahim and Dwi Pebrianti Faculty of Electrical

More information

Particle Swarm Optimization

Particle Swarm Optimization Dario Schor, M.Sc., EIT schor@ieee.org Space Systems Department Magellan Aerospace Winnipeg Winnipeg, Manitoba 1 of 34 Optimization Techniques Motivation Optimization: Where, min x F(x), subject to g(x)

More information

Particle Swarm Optimization Based Approach for Location Area Planning in Cellular Networks

Particle Swarm Optimization Based Approach for Location Area Planning in Cellular Networks International Journal of Intelligent Systems and Applications in Engineering Advanced Technology and Science ISSN:2147-67992147-6799 www.atscience.org/ijisae Original Research Paper Particle Swarm Optimization

More information

A Hybrid Fireworks Optimization Method with Differential Evolution Operators

A Hybrid Fireworks Optimization Method with Differential Evolution Operators A Fireworks Optimization Method with Differential Evolution Operators YuJun Zheng a,, XinLi Xu a, HaiFeng Ling b a College of Computer Science & Technology, Zhejiang University of Technology, Hangzhou,

More information

Published by: PIONEER RESEARCH & DEVELOPMENT GROUP ( 1

Published by: PIONEER RESEARCH & DEVELOPMENT GROUP (  1 Cluster Based Speed and Effective Feature Extraction for Efficient Search Engine Manjuparkavi A 1, Arokiamuthu M 2 1 PG Scholar, Computer Science, Dr. Pauls Engineering College, Villupuram, India 2 Assistant

More information

A Classifier with the Function-based Decision Tree

A Classifier with the Function-based Decision Tree A Classifier with the Function-based Decision Tree Been-Chian Chien and Jung-Yi Lin Institute of Information Engineering I-Shou University, Kaohsiung 84008, Taiwan, R.O.C E-mail: cbc@isu.edu.tw, m893310m@isu.edu.tw

More information

Feature Selection Using Modified-MCA Based Scoring Metric for Classification

Feature Selection Using Modified-MCA Based Scoring Metric for Classification 2011 International Conference on Information Communication and Management IPCSIT vol.16 (2011) (2011) IACSIT Press, Singapore Feature Selection Using Modified-MCA Based Scoring Metric for Classification

More information

Clustering of datasets using PSO-K-Means and PCA-K-means

Clustering of datasets using PSO-K-Means and PCA-K-means Clustering of datasets using PSO-K-Means and PCA-K-means Anusuya Venkatesan Manonmaniam Sundaranar University Tirunelveli- 60501, India anusuya_s@yahoo.com Latha Parthiban Computer Science Engineering

More information

An Island Based Hybrid Evolutionary Algorithm for Optimization

An Island Based Hybrid Evolutionary Algorithm for Optimization An Island Based Hybrid Evolutionary Algorithm for Optimization Changhe Li and Shengxiang Yang Department of Computer Science, University of Leicester University Road, Leicester LE1 7RH, UK {cl160,s.yang}@mcs.le.ac.uk

More information

An Effective Performance of Feature Selection with Classification of Data Mining Using SVM Algorithm

An Effective Performance of Feature Selection with Classification of Data Mining Using SVM Algorithm Proceedings of the National Conference on Recent Trends in Mathematical Computing NCRTMC 13 427 An Effective Performance of Feature Selection with Classification of Data Mining Using SVM Algorithm A.Veeraswamy

More information

Small World Network Based Dynamic Topology for Particle Swarm Optimization

Small World Network Based Dynamic Topology for Particle Swarm Optimization Small World Network Based Dynamic Topology for Particle Swarm Optimization Qingxue Liu 1,2, Barend Jacobus van Wyk 1 1 Department of Electrical Engineering Tshwane University of Technology Pretoria, South

More information

PARALLEL PARTICLE SWARM OPTIMIZATION IN DATA CLUSTERING

PARALLEL PARTICLE SWARM OPTIMIZATION IN DATA CLUSTERING PARALLEL PARTICLE SWARM OPTIMIZATION IN DATA CLUSTERING YASIN ORTAKCI Karabuk University, Computer Engineering Department, Karabuk, Turkey E-mail: yasinortakci@karabuk.edu.tr Abstract Particle Swarm Optimization

More information

RECORD-TO-RECORD TRAVEL ALGORITHM FOR ATTRIBUTE REDUCTION IN ROUGH SET THEORY

RECORD-TO-RECORD TRAVEL ALGORITHM FOR ATTRIBUTE REDUCTION IN ROUGH SET THEORY RECORD-TO-RECORD TRAVEL ALGORITHM FOR ATTRIBUTE REDUCTION IN ROUGH SET THEORY MAJDI MAFARJA 1,2, SALWANI ABDULLAH 1 1 Data Mining and Optimization Research Group (DMO), Center for Artificial Intelligence

More information

International Journal of Advance Research in Computer Science and Management Studies

International Journal of Advance Research in Computer Science and Management Studies Volume 3, Issue 11, November 2015 ISSN: 2321 7782 (Online) International Journal of Advance Research in Computer Science and Management Studies Research Article / Survey Paper / Case Study Available online

More information

Open Access Research on the Prediction Model of Material Cost Based on Data Mining

Open Access Research on the Prediction Model of Material Cost Based on Data Mining Send Orders for Reprints to reprints@benthamscience.ae 1062 The Open Mechanical Engineering Journal, 2015, 9, 1062-1066 Open Access Research on the Prediction Model of Material Cost Based on Data Mining

More information

Small World Particle Swarm Optimizer for Global Optimization Problems

Small World Particle Swarm Optimizer for Global Optimization Problems Small World Particle Swarm Optimizer for Global Optimization Problems Megha Vora and T.T. Mirnalinee Department of Computer Science and Engineering S.S.N College of Engineering, Anna University, Chennai,

More information

A Novel Social Network Structural Balance Based on the Particle Swarm Optimization Algorithm

A Novel Social Network Structural Balance Based on the Particle Swarm Optimization Algorithm BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 15, No 2 Sofia 2015 Print ISSN: 1311-9702; Online ISSN: 1314-4081 DOI: 10.1515/cait-2015-0026 A Novel Social Network Structural

More information

An Intelligent Mesh Based Multicast Routing Algorithm for MANETs using Particle Swarm Optimization

An Intelligent Mesh Based Multicast Routing Algorithm for MANETs using Particle Swarm Optimization 214 An Intelligent Mesh Based Multicast Routing Algorithm for MANETs using Particle Swarm Optimization E. Baburaj 1, and V. Vasudevan 2 1. Research Scholar, Anna University 2. Professor, Department of

More information

Constrained Single-Objective Optimization Using Particle Swarm Optimization

Constrained Single-Objective Optimization Using Particle Swarm Optimization 2006 IEEE Congress on Evolutionary Computation Sheraton Vancouver Wall Centre Hotel, Vancouver, BC, Canada July 16-21, 2006 Constrained Single-Objective Optimization Using Particle Swarm Optimization Karin

More information

Feature Selection based on Rough Sets and Particle Swarm Optimization

Feature Selection based on Rough Sets and Particle Swarm Optimization Feature Selection based on Rough Sets and Particle Swarm Optimization Xiangyang Wang a,*, Jie Yang a, Xiaolong Teng a, Weijun Xia b, Richard Jensen c a Institute of Image Processing and Pattern Recognition,

More information

Improving Results and Performance of Collaborative Filtering-based Recommender Systems using Cuckoo Optimization Algorithm

Improving Results and Performance of Collaborative Filtering-based Recommender Systems using Cuckoo Optimization Algorithm Improving Results and Performance of Collaborative Filtering-based Recommender Systems using Cuckoo Optimization Algorithm Majid Hatami Faculty of Electrical and Computer Engineering University of Tabriz,

More information

SSV Criterion Based Discretization for Naive Bayes Classifiers

SSV Criterion Based Discretization for Naive Bayes Classifiers SSV Criterion Based Discretization for Naive Bayes Classifiers Krzysztof Grąbczewski kgrabcze@phys.uni.torun.pl Department of Informatics, Nicolaus Copernicus University, ul. Grudziądzka 5, 87-100 Toruń,

More information

Feature Selection for Multi-Class Imbalanced Data Sets Based on Genetic Algorithm

Feature Selection for Multi-Class Imbalanced Data Sets Based on Genetic Algorithm Ann. Data. Sci. (2015) 2(3):293 300 DOI 10.1007/s40745-015-0060-x Feature Selection for Multi-Class Imbalanced Data Sets Based on Genetic Algorithm Li-min Du 1,2 Yang Xu 1 Hua Zhu 1 Received: 30 November

More information

CHAPTER 2 CONVENTIONAL AND NON-CONVENTIONAL TECHNIQUES TO SOLVE ORPD PROBLEM

CHAPTER 2 CONVENTIONAL AND NON-CONVENTIONAL TECHNIQUES TO SOLVE ORPD PROBLEM 20 CHAPTER 2 CONVENTIONAL AND NON-CONVENTIONAL TECHNIQUES TO SOLVE ORPD PROBLEM 2.1 CLASSIFICATION OF CONVENTIONAL TECHNIQUES Classical optimization methods can be classified into two distinct groups:

More information

LEARNING WEIGHTS OF FUZZY RULES BY USING GRAVITATIONAL SEARCH ALGORITHM

LEARNING WEIGHTS OF FUZZY RULES BY USING GRAVITATIONAL SEARCH ALGORITHM International Journal of Innovative Computing, Information and Control ICIC International c 2013 ISSN 1349-4198 Volume 9, Number 4, April 2013 pp. 1593 1601 LEARNING WEIGHTS OF FUZZY RULES BY USING GRAVITATIONAL

More information

Population Structure and Particle Swarm Performance

Population Structure and Particle Swarm Performance Population Structure and Particle Swarm Performance James Kennedy Bureau of Labor Statistics Washington, DC Kennedy_Jim@bls.gov Rui Mendes Universidade do Minho Braga, Portugal rui@omega.di.uminho.pt Abstract:

More information

Variable Neighborhood Particle Swarm Optimization for Multi-objective Flexible Job-Shop Scheduling Problems

Variable Neighborhood Particle Swarm Optimization for Multi-objective Flexible Job-Shop Scheduling Problems Variable Neighborhood Particle Swarm Optimization for Multi-objective Flexible Job-Shop Scheduling Problems Hongbo Liu 1,2,AjithAbraham 3,1, Okkyung Choi 3,4, and Seong Hwan Moon 4 1 School of Computer

More information

A METHOD FOR DIAGNOSIS OF LARGE AIRCRAFT ENGINE FAULT BASED ON PARTICLE SWARM ROUGH SET REDUCTION

A METHOD FOR DIAGNOSIS OF LARGE AIRCRAFT ENGINE FAULT BASED ON PARTICLE SWARM ROUGH SET REDUCTION A METHOD FOR DIAGNOSIS OF LARGE AIRCRAFT ENGINE FAULT BASED ON PARTICLE SWARM ROUGH SET REDUCTION ZHUANG WU Information College, Capital University of Economics and Business, Beijing 100070, China ABSTRACT

More information

INTEGRATION OF INVENTORY CONTROL AND SCHEDULING USING BINARY PARTICLE SWARM OPTIMIZATION ALGORITHM

INTEGRATION OF INVENTORY CONTROL AND SCHEDULING USING BINARY PARTICLE SWARM OPTIMIZATION ALGORITHM INTEGRATION OF INVENTORY CONTROL AND SCHEDULING USING BINARY PARTICLE SWARM OPTIMIZATION ALGORITHM Manash Dey Assistant Professor, Mechanical Engineering Department, JIMS EMTC Greater Noida (India) ABSTRACT

More information

Feature Selection in Knowledge Discovery

Feature Selection in Knowledge Discovery Feature Selection in Knowledge Discovery Susana Vieira Technical University of Lisbon, Instituto Superior Técnico Department of Mechanical Engineering, Center of Intelligent Systems, IDMEC-LAETA Av. Rovisco

More information

Scheme of Big-Data Supported Interactive Evolutionary Computation

Scheme of Big-Data Supported Interactive Evolutionary Computation 2017 2nd International Conference on Information Technology and Management Engineering (ITME 2017) ISBN: 978-1-60595-415-8 Scheme of Big-Data Supported Interactive Evolutionary Computation Guo-sheng HAO

More information

A Multiobjective Memetic Algorithm Based on Particle Swarm Optimization

A Multiobjective Memetic Algorithm Based on Particle Swarm Optimization A Multiobjective Memetic Algorithm Based on Particle Swarm Optimization Dr. Liu Dasheng James Cook University, Singapore / 48 Outline of Talk. Particle Swam Optimization 2. Multiobjective Particle Swarm

More information