Effect of the PSO Topologies on the Performance of the PSO-ELM


2012 Brazilian Symposium on Neural Networks

Elliackin M. N. Figueiredo and Teresa B. Ludermir
Center of Informatics, Federal University of Pernambuco, Recife, Brazil

Abstract: In recent years, the Extreme Learning Machine (ELM) has been hybridized with Particle Swarm Optimization (PSO); this hybridization is named PSO-ELM. In most of these hybridizations, the PSO uses the global topology. However, other topologies have been designed to improve the performance of the PSO. The performance of the PSO depends on its topology, and there is no single best topology for all problems. Thus, in this paper, we investigate the effect of the PSO topology on the performance of the PSO-ELM. In our study, we investigate five topologies: Global, Local, Von Neumann, Wheel, and Four Clusters. The results showed that the Global topology can be more promising than all other topologies.

Keywords: Extreme Learning Machine; Particle Swarm Optimization; topology

I. INTRODUCTION

Extreme Learning Machine (ELM) is an efficient learning algorithm for single-hidden layer feedforward neural networks (SLFNs) [1], created to overcome the drawbacks of gradient-based methods. In recent years, Particle Swarm Optimization (PSO) has been combined with ELM to improve the generalization capacity of SLFNs. In 2006, Xu and Shu presented an evolutionary ELM based on PSO, named PSO-ELM, and applied this algorithm to a prediction task [2]. In this work, the PSO was used to optimize the input weights and hidden biases of the SLFN. The objective function used to evaluate each particle was the root mean squared error (RMSE) on the validation set. In 2011, Han et al. [3] proposed an improved extreme learning machine named IPSO-ELM, which uses an improved PSO with adaptive inertia to select the input weights and hidden biases of the SLFN.
However, unlike the PSO-ELM, the IPSO-ELM optimizes the input weights and hidden biases according to both the RMSE on the validation set and the norm of the output weights. Thus, this algorithm can obtain good performance with a more compact and better-conditioned SLFN than other ELM approaches. In the same year, Sundararajan et al. [4] proposed a combination of an Integer Coded Genetic Algorithm (ICGA), PSO, and ELM for cancer classification. In this work, the ICGA is used for feature selection, while the PSO-ELM classifier is used to find the optimal input weights such that the ELM classifier can discriminate the cancer classes efficiently. This PSO-ELM was proposed in order to handle sparse and highly imbalanced data sets.

In all these works, the social topology used in the PSO is the global one. In this topology, all particles are fully connected; that is, each particle can communicate with every other particle. In this case, each particle is attracted towards the best solution found by the entire swarm. Besides the global topology, other topologies have been designed to improve the performance of the PSO. As observed in [5], the performance of the PSO depends strongly on its topology, and there is no outright best topology for all problems. Thus, it is important to investigate the effect of the PSO topologies on the performance of the PSO-ELM in the training of SLFNs. Studies on the effect of the PSO topology on the performance of the PSO in the training of neural networks have been carried out in the past [6]. More recently, in 2010, van Wyk and Engelbrecht investigated overfitting in neural networks trained with the PSO based on three topologies: Global, Local, and Von Neumann [7]. Thus, the objective of this paper is to carry out a similar study in the context of the PSO-ELM; that is, to investigate the effect of different PSO topologies on the performance of the PSO-ELM classifier.
In this paper, we studied five topologies: Global, Local, Von Neumann, Wheel, and Four Clusters. The experimental results showed that the Global topology can be more promising than all other topologies.

The rest of this paper is organized as follows. Section II presents the ELM algorithm. Section III presents the PSO algorithm and the topologies used in this work. The experimental arrangement is detailed in Section IV. Section V discusses the experimental results. The paper is concluded in Section VI along with the future work.

II. EXTREME LEARNING MACHINE

Extreme Learning Machine (ELM) is a simple learning algorithm for single-hidden layer feedforward neural networks (SLFNs) proposed by Huang et al. [1]. This learning algorithm not only learns much faster, with higher generalization performance, than traditional gradient-based learning algorithms, but also avoids many difficulties faced by gradient-based methods, such as stopping criteria, learning rate, learning epochs, and local minima [8].

ELM works as follows. For N distinct training samples (x_i, t_i) in R^n x R^m, where x_i = [x_{i1}, x_{i2}, ..., x_{in}]^T and t_i = [t_{i1}, t_{i2}, ..., t_{im}]^T, an SLFN with K hidden nodes and activation function g(x) is mathematically modeled as

    H b = T    (1)

where H = {h_{ij}} (i = 1, ..., N and j = 1, ..., K) is the hidden layer output matrix, with h_{ij} = g(w_j^T x_i + b_j) the output of the jth hidden node with respect to x_i; w_j = [w_{j1}, w_{j2}, ..., w_{jn}]^T is the weight vector connecting the jth hidden node and the input nodes, and b_j is the bias (threshold) of the jth hidden node; b = [b_1, b_2, ..., b_K]^T is the matrix of output weights, where b_j = [b_{j1}, b_{j2}, ..., b_{jm}]^T (j = 1, ..., K) is the weight vector connecting the jth hidden node and the output nodes; and T = [t_1, t_2, ..., t_N]^T is the matrix of targets. Thus, the ELM first generates the input weights and hidden biases randomly. Next, the determination of the output weights, linking the hidden layer to the output layer, consists simply in finding the least-squares solution to the linear system given by Eq. 1. The minimum norm least-squares (LS) solution to this system is

    b^ = H† T    (2)

where H† is the Moore-Penrose (MP) generalized inverse of the matrix H.

III. PARTICLE SWARM OPTIMIZATION

This section presents the basic Particle Swarm Optimization algorithm and the information sharing topologies.

A. Basic Algorithm

Particle Swarm Optimization (PSO) is a population-based stochastic optimization technique developed by Kennedy and Eberhart in 1995, which attempts to model the flight pattern of a flock of birds [9]. In PSO, each particle in the swarm represents a candidate solution for the objective function f(x). The current position of the ith particle at iteration t is denoted by x_i(t).
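Before detailing the PSO, note that the ELM step of Section II (random hidden weights, then the minimum-norm least-squares solution of Eq. 2 via the Moore-Penrose pseudoinverse) reduces to a few lines of NumPy. The following is an illustrative sketch, not the paper's implementation; all function and variable names are our own.

```python
import numpy as np

def elm_train(X, T, K, rng=np.random.default_rng(0)):
    """Train an SLFN with K hidden nodes by ELM.
    X: (N, n) input samples, T: (N, m) targets."""
    n = X.shape[1]
    W = rng.uniform(-1.0, 1.0, size=(K, n))    # random input weights w_j
    b = rng.uniform(-1.0, 1.0, size=K)         # random hidden biases b_j
    H = 1.0 / (1.0 + np.exp(-(X @ W.T + b)))   # sigmoid hidden outputs, cf. Eq. (1)
    beta = np.linalg.pinv(H) @ T               # minimum-norm LS solution, Eq. (2)
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W.T + b)))
    return H @ beta
```

With enough hidden nodes (K at least N), the least-squares solution interpolates the training targets almost exactly, which is why ELM needs no iterative weight tuning for the output layer.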
In each iteration of the PSO, each particle moves through the search space with a velocity v_i(t) calculated as follows:

    v_i(t+1) = w v_i(t) + c_1 r_1 [y_i(t) - x_i(t)] + c_2 r_2 [y^(t) - x_i(t)]    (3)

where w is the inertia weight, y_i(t) is the personal best position of particle i at iteration t, and y^(t) is the global best position of the swarm at iteration t. The personal best position is called pbest, and it represents the best position found by the particle during the search process up to iteration t. The global best position is called gbest, and it constitutes the best position found by the entire swarm up to iteration t. The quality (fitness) of a solution is measured by the objective function f(x_i(t)), calculated from the current position of particle i. The terms c_1 and c_2 are acceleration coefficients. The terms r_1 and r_2 are vectors whose components r_1j and r_2j are random numbers sampled from a uniform distribution U(0,1). The velocity is limited to the range [v_min, v_max]. After updating the velocity, the new position of particle i at iteration t+1 is calculated using Eq. 4:

    x_i(t+1) = x_i(t) + v_i(t+1).    (4)

Algorithm 1 summarizes the PSO.

Algorithm 1: PSO Algorithm
 1: Initialize the swarm S;
 2: while stopping condition is false do
 3:   for all particles i of the swarm do
 4:     Calculate the fitness f(x_i(t));
 5:     Set the personal best position y_i(t);
 6:   end for
 7:   Set the global best position y^(t) of the swarm;
 8:   for all particles i of the swarm do
 9:     Update the velocity v_i(t);
10:     Update the position x_i(t);
11:   end for
12: end while

B. Topologies

The PSO described above is known as the gbest PSO. In this PSO, each particle is connected to every other particle in the swarm, in a type of topology referred to as the Global topology. Different topologies have been developed for the PSO and empirically studied, each affecting the performance of the PSO in a potentially drastic way [10].
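Algorithm 1 with the update rules of Eqs. 3 and 4 translates directly into code. The sketch below is a minimal gbest PSO on a toy sphere function; the parameter values and names are illustrative, not those used in the paper's experiments.

```python
import numpy as np

def gbest_pso(f, dim, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5,
              v_max=1.0, rng=np.random.default_rng(0)):
    x = rng.uniform(-1.0, 1.0, size=(n_particles, dim))  # positions x_i(t)
    v = np.zeros((n_particles, dim))                     # velocities v_i(t)
    pbest = x.copy()                                     # personal bests y_i(t)
    pbest_f = np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()                 # global best y^(t)
    for _ in range(iters):
        r1 = rng.uniform(size=(n_particles, dim))        # r_1 ~ U(0,1)
        r2 = rng.uniform(size=(n_particles, dim))        # r_2 ~ U(0,1)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)  # Eq. (3)
        v = np.clip(v, -v_max, v_max)                    # clamp to [v_min, v_max]
        x = x + v                                        # Eq. (4)
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f                            # update personal bests
        pbest[better] = x[better]
        pbest_f[better] = fx[better]
        g = pbest[np.argmin(pbest_f)].copy()             # update global best
    return g, float(pbest_f.min())

# Example: minimize the sphere function f(p) = sum(p^2)
best, best_f = gbest_pso(lambda p: float(np.sum(p ** 2)), dim=5)
```

Note that only the selection of g changes between topologies: in the gbest PSO it is the swarm-wide best, while the other topologies of Section III-B replace it with the best particle in each particle's neighborhood.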
In this paper, five topologies well known in the literature were selected for comparison:

Global: In this topology, the swarm is fully connected and each particle can therefore communicate with every other particle, as illustrated in Fig. 1a. Thus, each particle is attracted towards the best solution found by the entire swarm. This topology has been shown to provide fast information flow between particles and faster convergence than other topologies, but with a susceptibility to being trapped in local minima.

Local: In this topology, each particle communicates with its n_N immediate neighbors. Thus, instead of a global best particle, each particle selects the best particle in its neighborhood. When n_N = 2, each particle within the swarm communicates with its immediately adjacent neighbors, as illustrated in Fig. 1b. In this paper, n_N = 2 was used.

Von Neumann: In this topology, each particle is connected to its left, right, top, and bottom neighbors on a two-dimensional lattice. This topology has been shown in a number of empirical studies to outperform other topologies on a large number of problems [10]. Figure 1c illustrates this topology.

Wheel: In this social topology, the particles in a neighborhood are isolated from one another. One particle serves as the focal point, and all information flow occurs through the focal particle, as illustrated in Fig. 1d. In this work, the focal particle was chosen randomly.

Four Clusters: In this topology, four clusters of particles are formed, with two connections between clusters, and each cluster is a fully connected subswarm, as illustrated in Fig. 1e.

[Figure 1. Topologies of the PSO: (a) Global, (b) Local, (c) Von Neumann, (d) Wheel, (e) Four Clusters.]

It is worth mentioning that the neighborhood of the particles in all topologies is based on their indexes. For a more detailed description of these topologies, the reader may consult [5].

IV. EXPERIMENTAL ARRANGEMENT

In this section, we give the details of the experimental arrangement.

A. PSO-ELM Description

In this work, we used the PSO to select the input weights and hidden biases of the SLFN, and the MP generalized inverse is used to analytically calculate the output weights. In the PSO used, the inertia varies linearly with time, as in [3]. The fitness of each particle is the root mean squared error (RMSE) on the validation set. We used the RMSE because it is a continuous metric; thus, the search process can be conducted in a smooth way.

B. Performance Metrics

In this paper, two metrics are used to evaluate the performance of the PSO-ELM: the root mean squared error (RMSE) over the validation set, referred to as RMSE_V, and the test accuracy, denoted by TA. The test accuracy refers to the percentage of correct classifications produced by the trained SLFNs on the testing set. Thus, the TA is calculated using Eq. 5:

    TA = 100 n_c / n_T    (5)

where n_T is the size of the testing set and n_c is the number of correct classifications. The RMSE over the validation set of size P is calculated using Eq. 6:

    RMSE_V = sqrt( sum_{p=1}^{P} sum_{k=1}^{K} (t_{k,p} - o_{k,p})^2 / (P K) )    (6)

where K is the number of output units in the SLFN, t_{k,p} is the target for pattern p at output k, and o_{k,p} is the output obtained by the network for pattern p at output k. Another metric used in this work is the global swarm diversity D_S [11], calculated as follows:

    D_S = (1/S) sum_{i=1}^{S} sqrt( sum_{k=1}^{I} (x_{ik} - xbar_k)^2 )    (7)

where S is the swarm size, I is the dimensionality of the search space, and x_i and xbar are the position of particle i and the swarm center, respectively. This metric is used to measure the convergence of the PSO: a small value indicates swarm convergence around the swarm center, while a large value indicates a higher dispersion of the particles away from the center.

C. Problem Sets

In our experiments, 5 benchmark classification problems obtained from the UCI Machine Learning Repository [12] were used. These problems present different degrees of difficulty and different numbers of classes. Missing values were handled using the most frequent value of each attribute. The problems and the chosen sizes of the hidden layer are given in Tab. I. The hidden layer sizes used were the same as in [13]. In our experiments, all attributes were normalized into the range [0,1], while the outputs (targets) were normalized into [-1,1]. The ELM activation function used was the sigmoid function. Each dataset was divided into training, validation, and testing sets, as specified in Tab. II. For all topologies, 30 independent executions were done for each dataset. The training, validation, and testing sets were randomly generated at each trial of the simulations. All topologies were executed on the same generated sets.
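Since the neighborhoods of all topologies are defined over particle indexes, they can be represented as adjacency lists; the swarm diversity of Eq. 7 is likewise a one-liner over the position matrix. The following sketch (illustrative names; the paper prescribes no implementation) builds the Local ring and the Von Neumann lattice neighborhoods and computes D_S.

```python
import numpy as np

def ring_neighbors(S):
    """Local topology with n_N = 2: particle i sees i-1 and i+1 (indexes mod S)."""
    return {i: [(i - 1) % S, (i + 1) % S] for i in range(S)}

def von_neumann_neighbors(rows, cols):
    """Von Neumann topology: left/right/top/bottom neighbors on a wrapping
    two-dimensional lattice of rows x cols particles."""
    nbrs = {}
    for i in range(rows * cols):
        r, c = divmod(i, cols)
        nbrs[i] = [((r - 1) % rows) * cols + c,   # top
                   ((r + 1) % rows) * cols + c,   # bottom
                   r * cols + (c - 1) % cols,     # left
                   r * cols + (c + 1) % cols]     # right
    return nbrs

def lbest(i, nbrs, pbest_f):
    """Index of the best personal best in particle i's neighborhood (incl. i);
    this replaces the swarm-wide gbest in non-global topologies."""
    return min(nbrs[i] + [i], key=lambda j: pbest_f[j])

def swarm_diversity(X):
    """Global swarm diversity D_S (Eq. 7): mean Euclidean distance of the
    particles (rows of X) from the swarm center."""
    center = X.mean(axis=0)
    return float(np.mean(np.linalg.norm(X - center, axis=1)))
```

A collapsed swarm (all particles at one point) gives D_S = 0, matching the interpretation of small D_S as convergence around the swarm center.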

[Table I. Data sets used in the experiments along with the hidden layer sizes of the SLFNs: Cancer, Pima Indians Diabetes, Glass Identification, Heart, and Iris, with their numbers of attributes, classes, and hidden nodes.]

[Table II. Partition of each data set into training, validation, and testing sets, with totals.]

D. PSO Configuration

The following PSO configuration was used for the experiments: the c_1 and c_2 control parameters were both set to 2.0. We used an adaptive inertia w, with an initial inertia of 0.9 and a final inertia of 0.4; this same scheme was used in the IPSO-ELM. The swarm size used was 20. We used few particles because we wanted to observe the effect of the topology on information sharing; if we used a very large number of particles, as in [2] and [3], the effect of the topology on the communication of the particles could be hidden. It is worth saying that we did not try to find the best number of particles for each topology; the same number of particles was used for all. The PSO was executed for 1000 iterations. We used many iterations because we wanted to provide enough time for the exchange of information between the particles for all topologies. All components of the particles are randomly initialized within the range [-1,1]. In this work, we used the reflexive approach, as was done in [2]; i.e., out-of-bounds particles in a given dimension invert their direction in that dimension and leave the boundary keeping the same speed. We used v_max = 1 and v_min = -1 for all dimensions of the search space.

V. RESULTS AND DISCUSSION

In this section, we present the results obtained. Figure 2 shows the evolution of RMSE_V along the iterations of the five PSO topologies on the Cancer dataset.

[Figure 2. RMSE_V over time on the Cancer dataset, for the Global, Local, Von Neumann, Wheel, and Four Clusters topologies. The graph shows the mean over 30 samples.]
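The linearly decreasing inertia (0.9 down to 0.4) and the reflexive boundary handling described above can be sketched as follows; the function names are illustrative, not from the paper.

```python
def inertia(t, t_max, w_start=0.9, w_end=0.4):
    """Inertia weight decreasing linearly from w_start to w_end over t_max
    iterations, as used in the IPSO-ELM scheme."""
    return w_start - (w_start - w_end) * t / t_max

def reflect(x, v, lo=-1.0, hi=1.0):
    """Reflexive bound handling: in each out-of-bounds dimension, the particle
    re-enters the search space with its direction inverted and the same speed."""
    for d in range(len(x)):
        if x[d] < lo:
            x[d] = lo + (lo - x[d])   # reflect position off the lower bound
            v[d] = -v[d]
        elif x[d] > hi:
            x[d] = hi - (x[d] - hi)   # reflect position off the upper bound
            v[d] = -v[d]
    return x, v
```

Reflexive handling keeps particles inside the feasible region without the stagnation that simple clamping to the bounds can cause.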
The graph corresponds to the mean over 30 samples. As can be seen from the graph, all topologies showed the same behavior until approximately the 400th iteration. After this iteration, the decrease in RMSE_V occurred differently for each topology. The Wheel topology, for example, presented an empirically slower decrease of RMSE_V than all other topologies. This result is consistent with the literature, which highlights the slow sharing of information in this topology. The same result was also observed, in a similar way, for all data sets evaluated. For the other topologies, the behaviors varied over the data sets. However, the following was observed: the topologies with the lowest RMSE_V in the last iteration were the Global and Von Neumann topologies for all datasets evaluated. For example, for Cancer and Heart, the Global topology presented a better RMSE_V than all other topologies, while for Glass, Iris, and Diabetes, the Von Neumann topology showed a better RMSE_V than all other topologies.

Tables III, IV, V, VI, and VII show the results for the data sets Cancer, Diabetes, Glass, Heart, and Iris, respectively. The value in bold for RMSE_V represents the lowest value obtained for RMSE_V, and the value in bold for D_S represents the lowest diversity among the topologies. We performed a one-way analysis of variance (ANOVA) to assess whether the performances of the topologies are statistically different or not; we used Levene's test with a significance level of 5% to check that the samples of the topologies had equal variances. We used this technique because the RMSE_V samples came from a normal distribution for all topologies. Figure 3 shows the 95% confidence intervals of the topologies for RMSE_V on the Cancer dataset; to generate these confidence intervals, we performed a multiple comparison test using the Matlab function multcompare. As can be seen, the topologies Global, Local, Von Neumann, and Four Clusters have statistically equivalent performance, while the Wheel topology has a
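The statistical protocol above (Levene's test for equal variances, one-way ANOVA on the normally distributed RMSE_V samples, and Kruskal-Wallis for the non-normal test accuracies) has a direct analog in SciPy; the paper used Matlab's multcompare, so the sketch below is a stand-in, and the sample arrays are illustrative placeholders, not the paper's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Illustrative stand-ins for 30 RMSE_V samples per topology:
samples = {t: rng.normal(0.2, 0.02, size=30)
           for t in ["Global", "Local", "VonNeumann", "Wheel", "FourClusters"]}
groups = list(samples.values())

_, p_levene = stats.levene(*groups)    # equal-variance check (alpha = 0.05)
_, p_anova = stats.f_oneway(*groups)   # one-way ANOVA on RMSE_V
_, p_kruskal = stats.kruskal(*groups)  # Kruskal-Wallis, for non-normal metrics
```

A p-value below 0.05 in the ANOVA (or Kruskal-Wallis) would indicate that at least one topology's performance differs, after which a post-hoc multiple comparison (as multcompare does) identifies which pairs differ.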

significantly lower performance than these topologies. The Diabetes and Glass datasets showed the same result, and their graphs are not shown here due to space limitations. For the Iris dataset, all topologies presented the same performance in statistical terms. Figure 4 shows the confidence intervals for the Heart dataset. As can be seen, the topologies Global, Von Neumann, and Four Clusters have statistically equivalent performance. The Wheel topology has a performance significantly different from all other topologies. Thus, in general, the topologies Global, Von Neumann, and Four Clusters have the same performance statistically. An explanation for this is that these topologies are more connected than all other topologies, and thus they have similar behavior. In relation to the test accuracy, all topologies showed statistically equivalent performance according to the Kruskal-Wallis test with 5% significance; we used the Kruskal-Wallis test because the test accuracies did not follow a normal distribution on all the data sets. In terms of diversity, the Global topology empirically achieved a lower result than all other topologies over all datasets. It is important to note that all topologies had low diversity values with small standard deviations over all datasets, indicating convergence of the PSO for all topologies.

Table III
RESULTS FOR CANCER. MEAN AND STANDARD DEVIATION (IN PARENTHESES).
Topology        RMSE_V     TA       D_S
Global          (0.0440)   (1.33)   (0.1833)
Local           (0.0384)   (1.44)   (0.1709)
Von Neumann     (0.0417)   (1.40)   (0.4683)
Wheel           (0.0345)   (1.52)   (0.4609)
Four Clusters   (0.0461)   (1.41)   (0.4516)

[Figure 3. Confidence intervals with 95% confidence for the Cancer dataset.]

Table IV
RESULTS FOR DIABETES. MEAN AND STANDARD DEVIATION (IN PARENTHESES).
Topology        RMSE_V     TA       D_S
Global          (0.0263)   (2.40)   (0.5129)
Local           (0.0235)   (2.25)   (0.1816)
Von Neumann     (0.0230)   (2.39)   (0.3877)
Wheel           (0.0227)   (2.11)   (0.5493)
Four Clusters   (0.0240)   (2.18)   (0.4128)

[Figure 4. Confidence intervals with 95% confidence for the Heart dataset.]

VI. CONCLUSION

In this paper, we investigated the effect of five topologies on the performance of the PSO-ELM. The results showed that, on all data sets evaluated, the Global and Von Neumann topologies achieved a lower RMSE_V, in an empirical analysis, than all other topologies. However, the topologies Global, Von Neumann, and Four Clusters were equivalent in the statistical analysis on all datasets. Analyzing the convergence, we see that the diversity of the Global topology in the last iteration is lower than that of all other topologies on all datasets. This means that the Global topology has a good capacity for exploitation, which can make it promising for finding good solutions for the PSO-ELM. Interestingly, all

topologies showed the same test accuracy in statistical terms. A hypothesis that can explain this fact is that all topologies get stuck in the same local minimum. This point needs to be studied further and is left as future work. Another future direction is to use dynamic topologies, such as the Dynamic Clan topology [14]; these topologies provide a good trade-off between exploration and exploitation.

Table V
RESULTS FOR GLASS. MEAN AND STANDARD DEVIATION (IN PARENTHESES).
Topology        RMSE_V     TA       D_S
Global          (0.0311)   (6.80)   (0.4405)
Local           (0.0272)   (6.12)   (0.1675)
Von Neumann     (0.0330)   (7.22)   (0.3916)
Wheel           (0.0262)   (7.11)   (0.4367)
Four Clusters   (0.0330)   (7.06)   (0.5615)

Table VI
RESULTS FOR HEART. MEAN AND STANDARD DEVIATION (IN PARENTHESES).
Topology        RMSE_V     TA       D_S
Global          (0.0491)   (4.73)   (0.2889)
Local           (0.0457)   (4.51)   (0.1294)
Von Neumann     (0.0490)   (4.74)   (0.5204)
Wheel           (0.0454)   (4.97)   (0.4675)
Four Clusters   (0.0451)   (4.43)   (0.4917)

Table VII
RESULTS FOR IRIS. MEAN AND STANDARD DEVIATION (IN PARENTHESES).
Topology        RMSE_V     TA       D_S
Global          (0.0457)   (2.69)   (0.2877)
Local           (0.0449)   (3.14)   (0.1376)
Von Neumann     (0.0485)   (2.84)   (0.2384)
Wheel           (0.0443)   (2.87)   (0.5140)
Four Clusters   (0.0405)   (2.98)   (0.3020)

REFERENCES

[1] G.-B. Huang, Q.-Y. Zhu, and C.-K. Siew, "Extreme learning machine: a new learning scheme of feedforward neural networks," in Proceedings of the IEEE International Joint Conference on Neural Networks, vol. 2, 2004.
[2] Y. Xu and Y. Shu, "Evolutionary extreme learning machine based on particle swarm optimization," ser. Lecture Notes in Computer Science, 2006.
[3] F. Han, H. Yao, and Q. Ling, "An improved extreme learning machine based on particle swarm optimization," ser. Lecture Notes in Computer Science, 2011.
[4] S. Saraswathi, S. Sundaram, N. Sundararajan, M. Zimmermann, and M. Nilsen-Hamilton, "ICGA-PSO-ELM approach for accurate multiclass cancer classification resulting in reduced gene sets in which genes encoding secreted proteins are highly represented," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 8, no. 2, 2011.
[5] A. P. Engelbrecht, Computational Intelligence: An Introduction. John Wiley & Sons Ltd.
[6] R. Mendes, P. Cortez, M. Rocha, and J. Neves, "Particle swarms for feedforward neural network training," in Proceedings of the International Joint Conference on Neural Networks (IJCNN'02), vol. 2, 2002.
[7] A. van Wyk and A. Engelbrecht, "Overfitting by PSO trained feedforward neural networks," in IEEE Congress on Evolutionary Computation (CEC), 2010.
[8] J. Cao, Z. Lin, and G.-B. Huang, "Composite function wavelet neural networks with extreme learning machine," Neurocomputing, vol. 73.
[9] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, 1995.
[10] J. Kennedy and R. Mendes, "Population structure and particle swarm performance," in Proceedings of the 2002 Congress on Evolutionary Computation (CEC'02), vol. 2, 2002.
[11] O. Olorunda and A. Engelbrecht, "Measuring exploration/exploitation in particle swarms using swarm diversity," in IEEE Congress on Evolutionary Computation (CEC 2008), June 2008.
[12] A. Frank and A. Asuncion, "UCI machine learning repository," [Online]. (Accessed: 12 July 2012)
[13] D. N. G. Silva, L. D. S. Pacifico, and T. B. Ludermir, "An evolutionary extreme learning machine based on group search optimization," in IEEE Congress on Evolutionary Computation (CEC).
[14] C. Bastos-Filho, D. Carvalho, E. Figueiredo, and P. de Miranda, "Dynamic clan particle swarm optimization," in Ninth International Conference on Intelligent Systems Design and Applications (ISDA'09), 2009.

Modified Particle Swarm Optimization

Modified Particle Swarm Optimization Modified Particle Swarm Optimization Swati Agrawal 1, R.P. Shimpi 2 1 Aerospace Engineering Department, IIT Bombay, Mumbai, India, swati.agrawal@iitb.ac.in 2 Aerospace Engineering Department, IIT Bombay,

More information

An Efficient Optimization Method for Extreme Learning Machine Using Artificial Bee Colony

An Efficient Optimization Method for Extreme Learning Machine Using Artificial Bee Colony An Efficient Optimization Method for Extreme Learning Machine Using Artificial Bee Colony Chao Ma College of Digital Media, Shenzhen Institute of Information Technology Shenzhen, Guangdong 518172, China

More information

Binary Differential Evolution Strategies

Binary Differential Evolution Strategies Binary Differential Evolution Strategies A.P. Engelbrecht, Member, IEEE G. Pampará Abstract Differential evolution has shown to be a very powerful, yet simple, population-based optimization approach. The

More information

Adaptative Clustering Particle Swarm Optimization

Adaptative Clustering Particle Swarm Optimization Adaptative Clustering Particle Swarm Optimization Salomão S. Madeiro, Carmelo J. A. Bastos-Filho, Member, IEEE, and Fernando B. Lima Neto, Senior Member, IEEE, Elliackin M. N. Figueiredo Abstract The performance

More information

Traffic Signal Control Based On Fuzzy Artificial Neural Networks With Particle Swarm Optimization

Traffic Signal Control Based On Fuzzy Artificial Neural Networks With Particle Swarm Optimization Traffic Signal Control Based On Fuzzy Artificial Neural Networks With Particle Swarm Optimization J.Venkatesh 1, B.Chiranjeevulu 2 1 PG Student, Dept. of ECE, Viswanadha Institute of Technology And Management,

More information

A *69>H>N6 #DJGC6A DG C<>C::G>C<,8>:C8:H /DA 'D 2:6G, ()-"&"3 -"(' ( +-" " " % '.+ % ' -0(+$,

A *69>H>N6 #DJGC6A DG C<>C::G>C<,8>:C8:H /DA 'D 2:6G, ()-&3 -(' ( +-   % '.+ % ' -0(+$, The structure is a very important aspect in neural network design, it is not only impossible to determine an optimal structure for a given problem, it is even impossible to prove that a given structure

More information

PARTICLE SWARM OPTIMIZATION (PSO) [1] is an

PARTICLE SWARM OPTIMIZATION (PSO) [1] is an Proceedings of International Joint Conference on Neural Netorks, Atlanta, Georgia, USA, June -9, 9 Netork-Structured Particle Sarm Optimizer Considering Neighborhood Relationships Haruna Matsushita and

More information

Overlapping Swarm Intelligence for Training Artificial Neural Networks

Overlapping Swarm Intelligence for Training Artificial Neural Networks Overlapping Swarm Intelligence for Training Artificial Neural Networks Karthik Ganesan Pillai Department of Computer Science Montana State University EPS 357, PO Box 173880 Bozeman, MT 59717-3880 k.ganesanpillai@cs.montana.edu

More information

An ELM-based traffic flow prediction method adapted to different data types Wang Xingchao1, a, Hu Jianming2, b, Zhang Yi3 and Wang Zhenyu4

An ELM-based traffic flow prediction method adapted to different data types Wang Xingchao1, a, Hu Jianming2, b, Zhang Yi3 and Wang Zhenyu4 6th International Conference on Information Engineering for Mechanics and Materials (ICIMM 206) An ELM-based traffic flow prediction method adapted to different data types Wang Xingchao, a, Hu Jianming2,

More information

Argha Roy* Dept. of CSE Netaji Subhash Engg. College West Bengal, India.

Argha Roy* Dept. of CSE Netaji Subhash Engg. College West Bengal, India. Volume 3, Issue 3, March 2013 ISSN: 2277 128X International Journal of Advanced Research in Computer Science and Software Engineering Research Paper Available online at: www.ijarcsse.com Training Artificial

More information

Improving Tree-Based Classification Rules Using a Particle Swarm Optimization

Improving Tree-Based Classification Rules Using a Particle Swarm Optimization Improving Tree-Based Classification Rules Using a Particle Swarm Optimization Chi-Hyuck Jun *, Yun-Ju Cho, and Hyeseon Lee Department of Industrial and Management Engineering Pohang University of Science

More information

Cell-to-switch assignment in. cellular networks. barebones particle swarm optimization

Cell-to-switch assignment in. cellular networks. barebones particle swarm optimization Cell-to-switch assignment in cellular networks using barebones particle swarm optimization Sotirios K. Goudos a), Konstantinos B. Baltzis, Christos Bachtsevanidis, and John N. Sahalos RadioCommunications

More information

CHAPTER 6 HYBRID AI BASED IMAGE CLASSIFICATION TECHNIQUES

CHAPTER 6 HYBRID AI BASED IMAGE CLASSIFICATION TECHNIQUES CHAPTER 6 HYBRID AI BASED IMAGE CLASSIFICATION TECHNIQUES 6.1 INTRODUCTION The exploration of applications of ANN for image classification has yielded satisfactory results. But, the scope for improving

More information

LECTURE 16: SWARM INTELLIGENCE 2 / PARTICLE SWARM OPTIMIZATION 2

LECTURE 16: SWARM INTELLIGENCE 2 / PARTICLE SWARM OPTIMIZATION 2 15-382 COLLECTIVE INTELLIGENCE - S18 LECTURE 16: SWARM INTELLIGENCE 2 / PARTICLE SWARM OPTIMIZATION 2 INSTRUCTOR: GIANNI A. DI CARO BACKGROUND: REYNOLDS BOIDS Reynolds created a model of coordinated animal

More information

Inertia Weight. v i = ωv i +φ 1 R(0,1)(p i x i )+φ 2 R(0,1)(p g x i ) The new velocity update equation:

Inertia Weight. v i = ωv i +φ 1 R(0,1)(p i x i )+φ 2 R(0,1)(p g x i ) The new velocity update equation: Convergence of PSO The velocity update equation: v i = v i +φ 1 R(0,1)(p i x i )+φ 2 R(0,1)(p g x i ) for some values of φ 1 and φ 2 the velocity grows without bound can bound velocity to range [ V max,v

More information

Fault Diagnosis of Power Transformers using Kernel based Extreme Learning Machine with Particle Swarm Optimization

Fault Diagnosis of Power Transformers using Kernel based Extreme Learning Machine with Particle Swarm Optimization Appl. Math. Inf. Sci. 9, o., 1003-1010 (015) 1003 Applied Mathematics & Information Sciences An International Journal http://dx.doi.org/10.1785/amis/09051 Fault Diagnosis of Power Transformers using Kernel

More information

Small World Particle Swarm Optimizer for Global Optimization Problems
Megha Vora and T.T. Mirnalinee, Department of Computer Science and Engineering, S.S.N College of Engineering, Anna University, Chennai

PARTICLE SWARM OPTIMIZATION (PSO)
J. Kennedy and R. Eberhart, Particle Swarm Optimization. Proceedings of the Fourth IEEE Int. Conference on Neural Networks, 1995. A population-based optimization technique

Hybrid PSO-SA algorithm for training a Neural Network for Classification
Sriram G. Sanjeevi, A. Naga Nikhila, Thaseem Khan and G. Sumathi, Dept. of CSE, National Institute

Improving Generalization of Radial Basis Function Network with Adaptive Multi-Objective Particle Swarm Optimization
Proceedings of the 2009 IEEE International Conference on Systems, Man, and Cybernetics, San Antonio, TX, USA, October 2009

An improved PID neural network controller for long time delay systems using particle swarm optimization algorithm
A. Lari, A. Khosravi and A. Alfi, Faculty of Electrical and Computer Engineering, Noushirvani

Assessing Particle Swarm Optimizers Using Network Science Metrics
Marcos A. C. Oliveira-Júnior, Carmelo J. A. Bastos-Filho and Ronaldo Menezes

Particle Swarm Optimization
Dario Schor, M.Sc., EIT, Space Systems Department, Magellan Aerospace Winnipeg, Winnipeg, Manitoba

Small World Network Based Dynamic Topology for Particle Swarm Optimization
Qingxue Liu and Barend Jacobus van Wyk, Department of Electrical Engineering, Tshwane University of Technology, Pretoria, South Africa

Clustering of datasets using PSO-K-Means and PCA-K-means
Anusuya Venkatesan, Manonmaniam Sundaranar University, Tirunelveli, India, and Latha Parthiban, Computer Science Engineering

Handling Multi Objectives of with Multi Objective Dynamic Particle Swarm Optimization
Richa Agnihotri, Dr. Shikha Agrawal and Dr. Rajeev Pandey, Department of Computer Science Engineering, UIT

Particle Swarm Optimization
Gonçalo Pereira, INESC-ID and Instituto Superior Técnico, Porto Salvo, Portugal, April 15, 2011

An Improved Kernel Based Extreme Learning Machine for Robot Execution Failures
Bin Li; The Scientific World Journal, Article ID 906546, 7 pages, http://dx.doi.org/10.1155/2014/906546

A Comparative Study of the Application of Swarm Intelligence in Kruppa-Based Camera Auto-Calibration
Ahmad Fariz Hasan, Ali Abuassal, Mutaz Khairalla, Amar Faiz Zainal Abidin and Mohd Fairus (ISSN 2229-5518)

Using CODEQ to Train Feed-forward Neural Networks
Mahamed G. H. Omran and Faisal al-Adwani, Department of Computer Science, Gulf University for Science and Technology, Kuwait

Neural Network Weight Selection Using Genetic Algorithms
David Montana; presented by Carl Fink, Hongyi Chen, Jack Cheng, Xinglong Li, Bruce Lin and Chongjie Zhang, April 12, 2005

Reconfiguration Optimization for Loss Reduction in Distribution Networks using Hybrid PSO algorithm and Fuzzy logic
Bulletin of Environment, Pharmacology and Life Sciences, Vol. 4 [9], August 2015: 115-120, Online ISSN 2277-1808

CHAPTER 6 ORTHOGONAL PARTICLE SWARM OPTIMIZATION
Orthogonal arrays are helpful in guiding heuristic algorithms to obtain a good solution when applied to NP-hard problems.

THREE PHASE FAULT DIAGNOSIS BASED ON RBF NEURAL NETWORK OPTIMIZED BY PSO ALGORITHM
M. Sivakumar, Anna University, Tamilnadu, India, and R. M. S. Parvathi, Sengunthar College of Engineering, Tamilnadu

Constraints in Particle Swarm Optimization of Hidden Markov Models
Martin Macaš, Daniel Novák and Lenka Lhotská, Czech Technical University, Faculty of Electrical Engineering, Dept. of Cybernetics, Prague

Parallel Particle Swarm Optimization in Data Clustering
Yasin Ortakcı, Computer Engineering Department, Karabuk University, Karabuk, Turkey (index terms: PSO, parallel computing, clustering, multiprocessor)

Three-Dimensional Off-Line Path Planning for Unmanned Aerial Vehicle Using Modified Particle Swarm Optimization
Lana Dalawr Jalal. This paper addresses the problem of offline path planning for

Extreme Learning Machine: A New Learning Scheme of Feedforward Neural Networks
Guang-Bin Huang, Qin-Yu Zhu and Chee-Kheong Siew, School of Electrical and Electronic Engineering, Nanyang Technological University

Feature weighting using particle swarm optimization for learning vector quantization classifier
A. Dongoran et al., 2018, Journal of Physics: Conference Series (open access)

A Particle Swarm Optimization Algorithm for Solving Flexible Job-Shop Scheduling Problem
Journal of Basic and Applied Scientific Research, TextRoad Publication, 2011, ISSN 2090-4304; Mohammad

Solving Large Regression Problems using an Ensemble of GPU-accelerated ELMs
Mark van Heeswijk, Yoan Miche, Erkki Oja and Amaury Lendasse, Helsinki University of Technology, Dept. of Information

ARMA MODEL SELECTION USING PARTICLE SWARM OPTIMIZATION AND AIC CRITERIA
Mark S. Voss and Xin Feng, Department of Civil and Environmental; Copyright 2002 IFAC, 15th Triennial World Congress, Barcelona, Spain

Particle Swarm Optimization applied to Pattern Recognition
Abel Mengistu, advisor Dr. Raheel Ahmad, CS Senior Research, Manchester College, May 2011

CHAPTER 5 OPTIMAL CLUSTER-BASED RETRIEVAL
Document clustering can be applied to improve the retrieval process. Fast and high-quality document clustering algorithms play an important

MUSSELS WANDERING OPTIMIZATION ALGORITHM BASED TRAINING OF ARTIFICIAL NEURAL NETWORKS FOR PATTERN CLASSIFICATION
Ahmed A. Abusnaina and Rosni Abdullah, School of Computer Sciences, Universiti Sains

Application of Improved Discrete Particle Swarm Optimization in Logistics Distribution Routing Problem
Procedia Engineering 15 (2011): 3673-3677 (Advances in Control Engineering and Information Science)

IMPROVING THE PARTICLE SWARM OPTIMIZATION ALGORITHM USING THE SIMPLEX METHOD AT LATE STAGE
Fang Wang and Yuhui Qiu, Intelligent Software and Software Engineering Laboratory, Southwest-China Normal University

SIMULTANEOUS COMPUTATION OF MODEL ORDER AND PARAMETER ESTIMATION FOR ARX MODEL BASED ON MULTI-SWARM PARTICLE SWARM OPTIMIZATION
Kamil Zakwan Mohd Azmi, Zuwairie Ibrahim and Dwi Pebrianti, Faculty of Electrical

An Approach to Polygonal Approximation of Digital Curves Based on Discrete Particle Swarm Algorithm
Journal of Universal Computer Science, vol. 13, no. 10 (2007): 1449-1461

A Hybrid Fireworks Optimization Method with Differential Evolution Operators
YuJun Zheng, XinLi Xu and HaiFeng Ling, College of Computer Science & Technology, Zhejiang University of Technology, Hangzhou

GRID SCHEDULING USING ENHANCED PSO ALGORITHM
P. Mathiyalagan, U. R. Dhepthie and S. N. Sivanandam, Department of Computer Science and Engineering

A MULTI-SWARM PARTICLE SWARM OPTIMIZATION WITH LOCAL SEARCH ON MULTI-ROBOT SEARCH SYSTEM
Bahareh Nakisa, Mohammad Naim Rastgoo, Mohammad Faidzul Nasrudin and Mohd Zakree Ahmad Nazri, Department of Computer

QUANTUM BASED PSO TECHNIQUE FOR IMAGE SEGMENTATION
Shradha Chawla and Vivek Panwar; International Journal of Computer Engineering and Applications, Volume VIII, Issue I, Part I, October 2014

Adaptive Multi-objective Particle Swarm Optimization Algorithm
P. K. Tripathi, Sanghamitra Bandyopadhyay, Senior Member, IEEE, and S. K. Pal, Fellow, IEEE

Experimental Study on Bound Handling Techniques for Multi-Objective Particle Swarm Optimization
Devang Agarwal and Deepak Sharma, Department of Mechanical; Springer-Verlag Berlin Heidelberg, 2011

Knowledge Discovery using PSO and DE Techniques
In the recent past, there has been an enormous increase in the amount of

Center-Based Sampling for Population-Based Algorithms
Shahryar Rahnamayan, Member, IEEE, and G. Gary Wang

An Optimization of Association Rule Mining Algorithm using Weighted Quantum behaved PSO
S. Deepa, PG Student, Department of Information Technology, and M. Kalimuthu, Associate Professor, Department of

CHAPTER 2 CONVENTIONAL AND NON-CONVENTIONAL TECHNIQUES TO SOLVE ORPD PROBLEM
Classical optimization methods can be classified into two distinct groups:

Comparison of Some Evolutionary Algorithms for Approximate Solutions of Optimal Control Problems
Akbar H. Borzabadi; Australian Journal of Basic and Applied Sciences, 4(8): 3366-3382, 2010, ISSN 1991-8178

LEARNING WEIGHTS OF FUZZY RULES BY USING GRAVITATIONAL SEARCH ALGORITHM
International Journal of Innovative Computing, Information and Control, ISSN 1349-4198, Volume 9, Number 4, April 2013, pp. 1593-1601

Statistical Pattern Recognition
Features and Feature Selection; Hamid R. Rabiee and Jafar Muhammadi, Spring 2014, http://ce.sharif.edu/courses/92-93/2/ce725-2/

Optimization of Benchmark Functions Using Artificial Bee Colony (ABC) Algorithm
IOSR Journal of Engineering (IOSRJEN), e-ISSN 2250-3021, p-ISSN 2278-8719, Vol. 3, Issue 10 (October 2013), pp. 09-14

Hamming Distance based Binary PSO for Feature Selection and Classification from high dimensional Gene Expression Data
Haider Banka and Suresh Dara, Department of Computer Science and Engineering, Indian

4.12 Generalization
In back-propagation learning, as many training examples as possible are typically used. It is hoped that the network so designed generalizes well. A network generalizes well when

Cluster-based Particle Swarm Algorithm for Solving the Mastermind Problem
Dan Partynski; March 12-14, 2014, Hong Kong. In this paper we present a metaheuristic algorithm that is inspired by Particle

Discrete Particle Swarm Optimization With Local Search Strategy for Rule Classification
Min Chen and Simone A. Ludwig, Department of Computer Science, North Dakota State University, Fargo, ND, USA

Comparing Classification Performances between Neural Networks and Particle Swarm Optimization for Traffic Sign Recognition
Thongchai Surinwarangkoon, Supot Nitsuwat and Elvin J. Moore, Department of Information

Extending reservoir computing with random static projections: a hybrid between extreme learning and RC
John Butcher, David Verstraeten, Benjamin Schrauwen, Charles Day and Peter Haycock, Institute

Feature Selection Using Modified-MCA Based Scoring Metric for Classification
2011 International Conference on Information Communication and Management, IPCSIT vol. 16 (2011), IACSIT Press, Singapore

Classification of Soil and Vegetation by Fuzzy K-means Classification and Particle Swarm Optimization
M. Chapron, ETIS, ENSEA, UCP, CNRS, 6 avenue du Ponceau, 95014 Cergy-Pontoise, France

Restricted Boltzmann machine to determine the input weights for extreme learning machines
Andre G. C. Pacheco, Graduate Program in Computer Science, PPGI, Federal University; arXiv:1708.05376v1 [cs.NE], 17 Aug 2017

Hybrid Particle Swarm-Based-Simulated Annealing Optimization Techniques
Nasser Sadati

Regularized Committee of Extreme Learning Machine for Regression Problems
Pablo Escandell-Montero, José M. Martínez-Martínez, Emilio Soria-Olivas, Josep Guimerá-Tomás, Marcelino Martínez-Sober and Antonio

A Combinatorial Algorithm for The Cardinality Constrained Portfolio Optimization Problem
Tianxiang Cui, Shi Cheng and Ruibin; 2014 IEEE Congress on Evolutionary Computation (CEC), July 2014, Beijing, China

OMBP: Optic Modified BackPropagation training algorithm for fast convergence of Feedforward Neural Network
2011 International Conference on Telecommunication Technology and Applications, Proc. of CSIT vol. 5 (2011), IACSIT Press, Singapore

Discrete Multi-Valued Particle Swarm Optimization
Jim Pugh and Alcherio Martinoli, Swarm-Intelligent Systems Group, École Polytechnique Fédérale de Lausanne, Switzerland

Application of Global Optimization Methods for Feature Selection and Machine Learning
Mathematical Problems in Engineering, Volume 2013, Article ID 241517, 8 pages, http://dx.doi.org/10.1155/2013/241517

Real-Time Image Classification Using LBP and Ensembles of ELM
Stevica Cvetković; Scientific Publications of the State University of Novi Pazar, Ser. A: Appl. Math. Inform. and Mech., vol. 8, 1 (2016): 103-111

Image Classification using Fast Learning Convolutional Neural Networks
Keonhee Lee and Dong-Chul Park, Software Device Research Center, Korea; pp. 50-55, http://dx.doi.org/10.14257/astl.2015.113.11

PSO-BP ALGORITHM IMPLEMENTATION FOR MATERIAL SURFACE IMAGE IDENTIFICATION
Fathin Liyana Zainudin, Abd Kadir Mahamad, Sharifah Saon and Musli Nizam Yahya, Faculty of Electrical and Electronics

Discrete Particle Swarm Optimization for TSP based on Neighborhood
Huilian Fan, School of Mathematics and; Journal of Computational Information Systems 6:0 (200) 3407-344, available at http://www.jofcis.com

Wrapper Feature Selection using Discrete Cuckoo Optimization Algorithm
S.J. Mousavirad and H. Ebrahimpour-Komleh, Department of Computer and Electrical Engineering, University of Kashan, Kashan

Semi-Supervised Clustering with Partial Background Information
Jing Gao, Pang-Ning Tan and Haibin Cheng. Incorporating background knowledge into unsupervised clustering algorithms has been the subject

Comparative Study of Instance Based Learning and Back Propagation for Classification Problems
Nadia Kanwal, Department of Computer Science, Lahore College for Women University, Lahore, and Erkan Bostanci

Tracking Changing Extrema with Particle Swarm Optimizer
Anthony Carlisle, Department of Mathematical and Computer Sciences, Huntingdon College

Dataset Watermarking Using Hybrid Optimization and ELM
Vithyaa Raajamanickam, Research Scholar, and R. Mahalakshmi, Assistant Professor, Park's College, Chinnakarai, Tirupur 641605

Particle Swarm Optimization Artificial Bee Colony Chain (PSOABCC): A Hybrid Metaheuristic Algorithm
Oğuz Altun, Department of Computer Engineering, Yildiz Technical University, Istanbul, Turkey

GENETIC ALGORITHM VERSUS PARTICLE SWARM OPTIMIZATION IN N-QUEEN PROBLEM
Azhar W. Hammad and Dr. Ban N. Thannoon, Al-Nahrain; Journal of Al-Nahrain University, Vol. 10(2), December 2007, pp. 172-177

Statistical Pattern Recognition
Features and Feature Selection; Hamid R. Rabiee and Jafar Muhammadi, Spring 2012, http://ce.sharif.edu/courses/90-91/2/ce725-1/

Statistical Pattern Recognition
Features and Feature Selection; Hamid R. Rabiee and Jafar Muhammadi, Spring 2013, http://ce.sharif.edu/courses/91-92/2/ce725-1/

Using a genetic algorithm for editing k-nearest neighbor classifiers
R. Gil-Pita, Teoría de la Señal y Comunicaciones, Universidad de Alcalá, Madrid (Spain), and X. Yao, Computer Sciences Department

Regularized Fuzzy Neural Networks for Pattern Classification Problems
Paulo Vitor C. Souza, System Analyst, Secretariat of Information Governance, CEFET-MG, and Teacher, University Center of Belo Horizonte

Nelder-Mead Enhanced Extreme Learning Machine
Philip Reiner and Bogdan M. Wilamowski, 17th IEEE Intelligent Engineering Systems Conference (INES 2013), Costa Rica, June 2013, pp. 225-230

A Polar Coordinate Particle Swarm Optimiser
Wiehann Matthysen and Andries P. Engelbrecht, Department of Computer Science, University of Pretoria, South Africa

An Algorithm for Incremental Construction of Feedforward Networks of Threshold Units with Real Valued Inputs
Dhananjay S. Phatak, Electrical Engineering Department, State University of New York, Binghamton

Meta-Heuristic based Optimization Algorithms: A Comparative Study of Genetic Algorithm and Particle Swarm Optimization
2017 2nd International Electrical Engineering Conference (IEEC 2017), May 19-20, 2017, IEP Centre, Karachi, Pakistan

A PSO-based Generic Classifier Design and Weka Implementation Study
Hui Hu, Xiaodong Mao and Qin Xi, School of Economics and; International Forum on Mechanical, Control and Automation (IFMCA 16)

Feature Selection Algorithm with Discretization and PSO Search Methods for Continuous Attributes
Madhu G., Rajinikanth T.V. and Govardhan A., Dept. of Information Technology, VNRVJIET, Hyderabad, India

A Modified PSO Technique for the Coordination Problem in Presence of DG
M. El-Saadawi, A. Hassan and M. Saeed, Dept. of Electrical Engineering, Faculty of Engineering, Mansoura University, Egypt