Application of the Evolutionary Algorithms for Classifier Selection in Multiple Classifier Systems with Majority Voting


Dymitr Ruta and Bogdan Gabrys

Applied Computational Intelligence Research Unit, Division of Computing and Information Systems, University of Paisley, High Street, Paisley PA1-2EP, United Kingdom

Abstract. In many pattern recognition tasks, an approach based on combining classifiers has shown a significant potential gain over the performance of the single best classifier. This improvement turns out to depend on a sufficient level of diversity among the classifiers, which in general can be regarded as a selective property of classifier subsets. Given a large number of classifiers, an intelligent classifier selection process therefore becomes a crucial issue in multiple classifier system design. In this paper we investigate three evolutionary optimization methods for the classifier selection task. Based on our previous studies of various diversity measures and their correlation with the majority voting error, we adopt the majority voting performance computed on the validation set directly as the fitness function guiding the search. To prevent overfitting to the training data, we extract a population of best unique classifier combinations and use them for second-stage majority voting. In this work we intend to show empirically that an efficient evolutionary-based selection leads to results comparable to the absolute best found exhaustively. Moreover, as we show for selected datasets, introducing second-stage combining by majority voting has the potential both to further improve the recognition rate and to increase the reliability of the combined outputs.

1 Introduction

Research devoted to pattern recognition shows that no individual method is best for all classification tasks [1].
As a result, increasing effort is being directed towards the development of fusion methods, in the hope of achieving improved and stable classification performance for a wider family of pattern recognition problems. Indeed, classifier fusion has recently been shown to outperform the traditional single-best-classifier approach in many different applications [1-5]. In safety-critical systems, where the decisions taken are of crucial importance, any method offering an improvement of the classification rate is invaluable, even if it leads to a higher complexity of the model. In such cases, the design of a reliable classification

model should start from pooling all available classifiers, ensuring that no potentially supporting information is wasted. Given a large number of different classifiers, it is always a question of how many, and which ones, to select for combining in order to achieve the highest performance of the fusion method. So far the dominant approach has been to pick several best classifiers, which commonly results in only moderate improvement. As recently shown [4], picking several best classifiers does not necessarily lead to the best, or sometimes even to a good, solution. Further analysis revealed that in addition to individual performances, diversity among classifiers has to be taken into account for selection purposes [6,7]. Effectively, only reasonably diverse, and in particular negatively dependent, classifiers can offer a large improvement of the classification performance [8,9]. This fact imposes the necessity of selecting the most diverse classifiers, which are the most likely to produce robust combined results. Many researchers have applied different measures of diversity to select the best team of classifiers [7,10,11]. As shown in [11], for majority voting, diversity measures are particularly good at reducing system complexity, but selection based on diversity measures appears to be rather imprecise and limited to lower-order dependencies. Moreover, there are problems with consistent evaluation of diversity for a variable number of classifiers. An alternative to imprecise diversity-based selection is a direct search using the performance of the combiner as the selection criterion. There is then no imprecision inflicted by diversity measures, and the quality of the selection process relies fully on the search algorithms applied. However, this strategy requires operating on an exponentially large and rough search space.
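To make the exponential complexity concrete (our illustration, not from the paper): since majority voting is restricted to odd-sized subsets, the number of candidate combinations of M classifiers is the number of odd-sized subsets, which equals 2^(M-1).

```python
from math import comb

def odd_combinations(M):
    """Number of classifier subsets of odd size drawn from M classifiers.

    Sum of binomial coefficients C(M, r) over odd r, which equals 2**(M-1).
    """
    return sum(comb(M, r) for r in range(1, M + 1, 2))

print(odd_combinations(15))  # 16384 -- still exhaustively searchable
print(odd_combinations(50))  # 562949953421312 -- clearly intractable
```

For the 15-classifier pools used later in the paper the space is still small enough for exhaustive search, which is what makes the comparison against the exhaustive optimum possible there.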
Genetic algorithms, as well as other evolutionary algorithms, have been shown to deal well and efficiently with large, rough search spaces [12,13,16-18]. In this paper we assume selection from a large number of classifiers, with all available information in the form of hardened binary classifier outputs (correct/incorrect) obtained from classification performed on a validation set. We use majority voting (MV) as a combination method relevant to the assumptions above. Although very simple, MV quite often shows performance comparable to much more advanced techniques [14,15]. Moreover, as shown in [8,9], the theoretical possibilities of classification improvement using MV are tremendous, especially for large numbers of classifiers. Facing a large and rough search space, we applied well-known evolutionary algorithms: the genetic algorithm (GA) [16], tabu search (TS) [17] and population-based incremental learning (PBIL) [18]. The selected algorithms represent quite different approaches to evolutionary learning. We use them to search efficiently for a unique population of the best classifier combinations, and combine these further to improve the reliability of the obtained solutions.

The remainder of the paper is organized as follows. Section 2 explains the problem of classifier selection for optimal MV performance. In Section 3 we give a detailed analysis of the presented search algorithms, GA, TS and PBIL, and show the implementation solutions and adjustments needed to reach a satisfactory selection quality and compatibility with MV. Section 4 provides the results of experiments with real datasets. Finally, a summary and conclusions are given in Section 5.

2 Classifier Selection for Majority Voting

Even with simple majority voting, the classifier selection process is far from trivial. The first problem is the representation of a single combination of classifiers, which ought to be uniform throughout the search regardless of the number of classifiers used. We chose the binary strings used in GAs, in which the bit in the j-th position indicates inclusion (1) or exclusion (0) of the j-th classifier in the further fusion process. Another problem is the design of the fitness function. Based on our previous studies in [11], we decided not to use diversity measures, which are only vaguely correlated with MV performance. In our case the search algorithms account for system simplification, so there is no need for imprecise diversity measures; we therefore employ the MV performance directly as the fitness function. For a smaller number of classifiers, the quality of the search can always be inspected by comparing it with the results of an exhaustive search. For larger systems, due to exponential complexity, exhaustive search very quickly becomes intractable and, given the very rough search space, the global optimum is rarely known.

Given a system of M classifiers D = {D_1, ..., D_M}, let y_i = (y_{1i}, y_{2i}, ..., y_{Mi}) denote the joint output of the system for the i-th multidimensional input sample x_i, where y_{ji} = y_j(x_i), i = 1, ..., N, j = 1, ..., M, denotes the hardened output of the j-th classifier for data sample x_i. In this work we assume the transformed binary outputs to be y_{ji} = 1 for correct classification and y_{ji} = 0 for misclassification. Let v_k = (v_{k1}, v_{k2}, ..., v_{kM}) represent a combination of classifiers, where v_{kj} \in \{0, 1\} indicates inclusion (1) or exclusion (0) of the j-th classifier in the decision fusion.
Given a combination v_k, the combined decision produced by the MV combiner, y^{MV}_i(v_k), can be obtained by:

    y^{MV}_i(v_k) = \begin{cases} 1 & \text{if } \sum_{j=1}^{M} y_{ji} v_{kj} > \frac{1}{2} \sum_{j=1}^{M} v_{kj} \\ 0 & \text{otherwise} \end{cases}   (1)

Given a validation set X_{VA} = (x_1, x_2, ..., x_N), the selection can be reformulated as a simple optimization process in which the object of optimization is v_k and the fitness function, used throughout this study, is represented by the following formula:

    y^{MV}(v_k) = \frac{1}{N} \sum_{i=1}^{N} y^{MV}_i(v_k)   (2)

The MV definition shown above imposes further irregularities. Namely, it enforces combining only an odd number of classifiers; otherwise one would have to implement a rejection rule for an equal number of contradictory votes, which brings additional complexity to the system. Another problem is that even if the globally best validation combination is found, it may no longer be the optimal selection for the testing set. To avoid this problem, instead of keeping the single best combination, we extract a population of best solutions and apply them all in a second-stage combining process. In the experimental section we illustrate the advantage of second-stage combining, which both improved the classification performance and reduced its variability.
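As a minimal illustration of Eqs. (1)-(2), the following sketch (our naming, assuming the hardened outputs are stored as a NumPy 0/1 matrix) computes the MV fitness of a single combination:

```python
import numpy as np

def mv_fitness(Y, v):
    """Majority-voting fitness of one classifier combination.

    Y : (N, M) binary matrix, Y[i, j] = 1 iff classifier j is correct
        on validation sample i (hardened outputs).
    v : (M,) binary inclusion vector with an odd number of 1s.
    Returns the fraction of samples on which a strict majority of the
    selected classifiers is correct, i.e. Eq. (2) built from Eq. (1).
    """
    k = v.sum()                       # number of selected classifiers (odd)
    votes = Y @ v                     # correct votes per sample
    return np.mean(votes > k / 2.0)   # strict majority -> combined correct

# Toy example: 3 classifiers, 4 validation samples
Y = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 1],
              [1, 1, 1]])
v = np.array([1, 1, 1])
print(mv_fitness(Y, v))  # 1.0 -- the majority is correct on all 4 samples
```

An exhaustive search would simply evaluate `mv_fitness` for every odd-sized inclusion vector; the evolutionary algorithms below evaluate it only along their search trajectories.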

3 Searching Algorithms

The choice of search algorithms for classifier selection was dictated by several requirements. First, the algorithms should quickly and efficiently explore the large search spaces formed by the possible subsets of classifiers. Second, as mentioned above, combining at the second stage is to be pursued in this work; therefore, rather than a single best solution, a population of the best combinations found should be returned as the output of the search process. Furthermore, the algorithms should be rather sensitive to the search criterion, as due to the commonly high positive correlations among classifiers the differences in combining performance are expected to be small. For these reasons we proposed to use evolutionary algorithms operating directly on the combining performance. Three such algorithms are investigated here and specifically implemented for use with the majority voting combiner.

3.1 Genetic Algorithms

Genetic algorithms (GA) have been used for a number of pattern recognition problems [12,13,16]. There are several problems in adapting GA to classifier selection for combining with MV. The major one derives from the constraint of an odd number of classifiers that has to be imposed. To keep the number of selected classifiers odd throughout the search, the crossover and mutation operators have to be specially designed. Mutation is rather easy to implement: assuming an odd number of classifiers is set randomly during initialization, oddness can be preserved by mutating a pair of bits or, in general, any even number of bits. Crossover is much more difficult to control this way. To avoid making the GA too complex, crossover is performed traditionally and afterwards, if the offspring contains an even number of classifiers, one randomly selected bit is additionally mutated to bring the number of 1s in the chromosome back to odd.
To increase the exploration ability of the GA we introduced an additional operator of pairwise exchange, which simply swaps a random pair of bits within the chromosome, preserving the number of selected classifiers. In order to preserve the best combinations from generation to generation we applied a specific selection rule: the populations of parents and offspring are put together, and then a number of best chromosomes equal to the population size is selected for the next generation. Being aware of potential generalization problems, we also developed a simple diversifying operator. It enforces all chromosomes to be different from each other (unique) by mutating random bits until this requirement is met. The whole algorithm can be defined as follows:

1. Initialize a random population of n chromosomes
2. Calculate the fitness (MV performance) of each chromosome
3. Perform crossover and mutate single bits of offspring with an even number of 1s
4. Mutate all offspring at one or more randomly selected points
5. Apply one or more pairwise exchanges to each offspring
6. From all offspring and parents, select the n best unique chromosomes for the next generation
7. If convergence is reached then finish, else go to step 2
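The steps above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the even-bit mutation operator is folded into the single-bit parity repair after crossover, and uniqueness is enforced via a dictionary rather than by the bit-mutating diversifying operator.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_odd(M):
    """Random chromosome with an odd number of selected classifiers."""
    v = rng.integers(0, 2, M)
    if v.sum() % 2 == 0:
        v[rng.integers(M)] ^= 1
    return v

def crossover(a, b):
    """One-point crossover followed by the odd-parity repair mutation."""
    cut = int(rng.integers(1, len(a)))
    child = np.concatenate([a[:cut], b[cut:]])
    if child.sum() % 2 == 0:                 # even number of 1s -> flip one bit
        child[rng.integers(len(child))] ^= 1
    return child

def pairwise_exchange(v):
    """Swap a random pair of bits; the number of 1s is preserved."""
    i, j = rng.choice(len(v), size=2, replace=False)
    v[i], v[j] = v[j], v[i]
    return v

def ga_select(fitness, M, n=10, generations=30):
    """Return the n best unique odd-sized combinations found (steps 1-7)."""
    pop = [random_odd(M) for _ in range(n)]
    for _ in range(generations):
        kids = [pairwise_exchange(crossover(pop[int(rng.integers(len(pop)))],
                                            pop[int(rng.integers(len(pop)))]))
                for _ in range(n)]
        # parents + offspring together; keep the n best *unique* chromosomes
        pool = {tuple(v): fitness(np.array(v)) for v in pop + kids}
        ranked = sorted(pool, key=pool.get, reverse=True)[:n]
        pop = [np.array(v) for v in ranked]
    return pop

# Toy run with a hypothetical fitness that favours selecting classifiers 0, 2, 4
target = np.array([1, 0, 1, 0, 1, 0, 0])
best = ga_select(lambda v: -np.abs(v - target).sum(), M=7)
print(best[0].sum() % 2)  # 1 -- the odd-size constraint is preserved throughout
```

In the paper the fitness passed in would be the MV performance of Eq. (2) on the validation matrix.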

Although this particular version of GA represents a hill-climbing algorithm, multiple mutation and pairwise exchange together with the diversifying operator substantially extend its exploration abilities. The convergence condition can be associated with the case when no change in the mean MV error is observed for an arbitrarily large number of generations. Preliminary experiments with real classification datasets confirmed the superiority of the presented version of GA over its standard definition and highlighted the importance of the diversifying operator for the classifier selection process.

3.2 Tabu Search

Tabu search (TS) in its standard form is not a population-based algorithm, yet it shares some similarities with GAs, particularly in the encoding of the problem [17]. Instead of a population it uses only a single chromosome, mutated randomly at each step. Consequently there can be no crossover, and the only genetic change is provided by mutation. This strongly limits the ability of the algorithm to jump into different regions of the search space. Moreover, it represents a hill-climbing algorithm, which reaches convergence much faster than a typical GA but, on the other hand, may not find the global optimum, as it may simply be unreachable from the initial conditions. Effectively, tabu search in its original version quite easily gets trapped in local optima. To prevent such effects we applied multiple consecutive mutations together with pairwise exchange before the fitness is examined. Similarly to GA, we keep a population of the unique best chromosomes found during the process. As for the previous algorithm, the convergence condition is satisfied if the pool of k best solutions does not change for a fixed number of generations. The presented version of the TS algorithm can be described in the following steps:

1. Create a single random chromosome
2. Mutate the chromosome at one or more randomly selected points
3. Apply one or more pairwise exchanges
4. Test the fitness of the new chromosome: if it is fitter, the changes are accepted
5. Store the new chromosome if it is among the k unique best solutions found so far
6. If convergence is reached then finish, else go to step 2

3.3 Population-Based Incremental Learning

Due to the lack of a crossover operator, even after many adjustments tabu search partially loses the ability to explore the whole search space. There is, however, a possibility to regain the ability to reach most points of the search space while keeping the convergence property at a satisfactory level. The algorithm offering these properties is called population-based incremental learning (PBIL) [18]. It also uses a population of chromosomes, sampled from a special probability vector, which is updated at each step according to the fittest chromosomes. The update of the probability vector is performed according to a standard supervised learning method. Given a probability vector p = (p_1, p_2, ..., p_M) and a population of chromosomes G = (v_1, v_2, ..., v_C), where v_k = (v_{k1}, v_{k2}, ..., v_{kM}), each probability bit is updated according to the following expression:

    p_j^{new} = p_j^{old} + \Delta p_j, \qquad \Delta p_j = \eta \left[ \frac{1}{C} \sum_{k=1}^{C} v_{kj} - p_j \right]   (3)

where k = 1, ..., C refers to the C fittest chromosomes found and η controls the magnitude of the update. The number of best chromosomes used to update the probability vector, together with the magnitude factor η, controls the balance between the speed of convergence and the ability to explore the whole search space. Convergence is reached when the probability vector contains only integer values, 0 or 1; in such a case p becomes the best combination of classifiers. As we are rather in favor of obtaining a population of best solutions, these are extracted and stored during the process, preserving the uniqueness rule as in the previous algorithms. The PBIL algorithm can be described in the following steps:

1. Create a probability vector of the same length as the required chromosome and initialize each bit with the value 0.5
2. Sample a number of chromosomes according to the probability vector
3. Update the probability vector by increasing the probabilities at the positions where the fittest chromosomes had 1s
4. Update the pool of k best unique solutions
5. If all elements of the probability vector are 0 or 1 then finish, else go to step 2

Although the PBIL algorithm does not use the genetic operators observed in GA, it contains a specific mechanism that allows beneficial information to be exploited through generations, and thus preserves the stochastic elements of evolutionary algorithms.

4 Experiments

The experiments were organized in two groups. In the first part the presented algorithms were examined on three realistic datasets from the UCI Repository and compared against simple alternative strategies: exhaustive search (ES), the single-best classifier (SB) and random search (RS).
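Before moving to the results, the PBIL procedure of Sect. 3.3 can be sketched as follows (our naming; η is set to 0.1 here for a smooth update, whereas the experiments below use η = 1, and the odd-size constraint is omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(1)

def pbil_select(fitness, M, pop_size=50, C=5, eta=0.1, max_steps=200, k=50):
    """Population-based incremental learning over binary chromosomes."""
    p = np.full(M, 0.5)                    # step 1: uniform probability vector
    best = {}                              # pool of k best unique solutions
    for _ in range(max_steps):
        # step 2: sample a population of chromosomes from p
        G = (rng.random((pop_size, M)) < p).astype(int)
        fits = np.array([fitness(v) for v in G])
        # step 3: Eq. (3) -- move p towards the mean of the C fittest
        elite = G[np.argsort(fits)[-C:]]
        p += eta * (elite.mean(axis=0) - p)
        for v, f in zip(G, fits):          # step 4: update the unique pool
            best[tuple(v)] = f
        if np.all((p == 0) | (p == 1)):    # step 5: convergence test
            break
    top = sorted(best, key=best.get, reverse=True)[:k]
    return p, [np.array(v) for v in top]

# Toy run with a hypothetical fitness: number of selected classifiers
p, top = pbil_select(lambda v: int(v.sum()), M=10)
print(top[0].sum())  # the best pooled solution selects (nearly) all classifiers
```

With this soft update p only approaches 0/1 asymptotically, hence the additional `max_steps` cap; it mirrors the fixed budget of examined chromosomes used in the experiments below.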
Selection was performed from a set of 15 different classifiers available from PRTOOLS. Finally, in the second part of the experiments we investigated the possibility of combining at the second stage, by combining the MV outputs of the selections found best at the first level. In all experiments we used the same parameters of the algorithms, for which preliminary experiments showed the best results. Both PBIL and GA used 50 chromosomes in the population. In TS and GA, single-bit mutation was applied together with a single pairwise exchange. The learning rate for PBIL was set to η = 1. The best validation combinations were also examined on a testing set to evaluate the generalization ability. To be able to compare the algorithms in terms of efficiency, in all experiments each algorithm finished its run after examining a fixed number of chromosomes, which was used instead of specifying convergence conditions.

Footnotes:
1. Datasets: Iris (recognition of the types of iris plant): 150 samples, 4 features, 3 classes; Cancer (cancer diagnosis): 569 samples, 30 features, 2 classes; Diabetes (diabetes diagnosis): 768 samples, 8 features, 2 classes.
2. University of California Repository of Machine Learning Databases and Domain Theories, available free at: ftp.ics.uci.edu/pub/machine-learning-databases
3. Pattern Recognition Toolbox for Matlab 5.0+, implemented by R.P.W. Duin, available free at: ftp://ftp.ph.tn.tudelft.nl/pub/bob/prtools

4.1 Experiment 1

The pool of M = 15 different classifiers was applied to the 3 datasets from the UCI Repository. All the datasets were split into equally sized training, validation and testing sets. The trained classifiers were then applied to classification over the validation and testing sets. To estimate the true performances reliably, we repeated this process for many random splits of each dataset until we obtained binary matrices of size N = 5000 containing the classification results, separately for the validation and testing sets. The search algorithms were applied to the validation matrix. The search results for the first-stage MV combining are shown in Table 1. For all presented datasets, the performances of the best selections found by the proposed search algorithms were better than those quickly given by the SB selection and were very close to the attainable boundaries determined by ES. The time of searching was, however, substantially reduced in comparison with ES. For a larger number of classifiers ES becomes intractable, whilst the searching time of the presented algorithms increases slowly. Moreover, the balance between search precision and search time is adjustable and can be controlled by the search method parameters.

Table 1.
MV performance (BV) and the average over the 50 best (BV50) selections found by the search algorithms from the validation matrix obtained from classification of the Iris, Cancer and Diabetes datasets by 15 different classifiers. The last two rows contain the testing-matrix results, T(BV) and T(BV50), for the same selections. The time of searching corresponds to the time of checking 1000 different selections by each algorithm.

[Table values were not preserved in this transcription. For each dataset (Iris, Cancer, Diabetes) the table reports Time [s], BV [%], BV50 [%], T(BV) [%] and T(BV50) [%] for the SB, ES, RS, TS, PBIL and GA strategies.]

[Figure not preserved in this transcription: three panels (Iris dataset: PBIL vs. SB; Cancer dataset: TS vs. SB; Diabetes dataset: GA vs. SB), y-axis: MV performance [%].]

Fig. 1. Performance of the second-stage MV combiner compared against the mean MV performance of the best 50 validation combinations and the single best classifier, supported by the statistics of variability over different random splits of the datasets. The panels correspond to the datasets from Table 1 and relate to the testing-set performance. The shaded area limited by dashed lines, together with the dotted line in the middle, represents the SB confidence intervals and the mean MV performance of the classifier selected by the SB strategy, respectively. The grey solid line shows the mean MV performance of the best 50 validation selections with their confidence intervals. The black solid line represents the MV performance of the second-stage MV combining, shown as a function of the number of best validation combinations, with the corresponding confidence intervals. All confidence intervals were obtained by calculating the means over the different splits and taking 3 times the standard deviation.
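The second-stage MV combiner evaluated in Fig. 1, together with the ±3σ bands, can be sketched in the paper's correct/incorrect output encoding as follows (our reconstruction; `mv_correct`, `second_stage_mv` and `confidence_band` are illustrative names):

```python
import numpy as np

def mv_correct(Y, v):
    """First-stage MV: 1 where the majority of selected classifiers is correct.
    Y is an (N, M) binary correctness matrix, v an (M,) inclusion vector."""
    return ((Y @ v) > v.sum() / 2.0).astype(int)

def second_stage_mv(Y, selections):
    """Majority vote over the first-stage MV outputs of several selections."""
    stage1 = np.array([mv_correct(Y, v) for v in selections])
    return (stage1.sum(axis=0) > len(selections) / 2.0).astype(int)

def confidence_band(perf_per_split):
    """Mean performance with a +/- 3 standard deviation band over splits."""
    perf = np.asarray(perf_per_split, dtype=float)
    m, s = perf.mean(), perf.std()
    return m - 3 * s, m, m + 3 * s

# Toy testing matrix: 2 samples, 3 classifiers; three best validation selections
Y = np.array([[1, 0, 1],
              [0, 1, 0]])
sels = [np.array([1, 0, 0]), np.array([0, 1, 0]), np.array([1, 1, 1])]
print(second_stage_mv(Y, sels))  # [1 0]
```

In this encoding the second-stage decision is correct exactly when the majority of the first-stage MV outputs are correct, which is what the thick black lines in Fig. 1 measure.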

4.2 Experiment 2

In this experiment we looked at the generalization ability of the analyzed selection algorithms. For that purpose we examined the idea of introducing a second stage of combining and its implications for the variability of the obtained results. We prepared statistics of the MV performances of the best selections obtained over consecutive splits of the examined datasets. Calculating the means and standard deviations of the MV performance over different splits allowed us to estimate the reliability of the selected models and compare it against the SB approach. The results for the 50 best combinations of classifiers (represented by thick grey lines) and SB (represented by dotted black lines) are illustrated in Fig. 1. It can be seen that relying only on the best selection found for the validation set is in general risky. This is due to the generalization dilemma, especially evident for small amounts of training data. A better and more reliable strategy turned out to be taking the MV outputs of a number of best validation selections and obtaining the final decision by second-stage majority voting. The performance of the second-stage MV combiner is shown by the thick black lines in Fig. 1. The plots show a slight improvement in comparison with any individual combination and also show that this strategy is much more reliable and stable with respect to the number of selections taken. The reliability improvement over the SB results stems from the decreased variance imposed by the aggregation of outputs.

5 Conclusions

In this paper we studied the applicability of three evolutionary optimization techniques to the problem of classifier selection for combining by majority voting operating on binary classification outputs. Introducing a binary-string representation of classifier combinations, we proposed specific implementations of the genetic algorithm, tabu search and population-based incremental learning, applied to the constrained majority voting rule accepting only an odd number of classifiers.
Facing a huge and rough search space, we assigned the majority voting performance directly as the fitness function and put the main effort into developing search algorithms with high exploration capabilities that simultaneously work fast enough to be applicable to large numbers of classifiers. Comparing the efficiency of the search with an exhaustive search, we mostly obtained the same best selections while substantially reducing the search time. In all experiments we recorded an improvement of the majority voting performance of the best selections found in comparison with the simple single-best selection strategy. Moreover, due to the aggregation applied, we observed an increased reliability of the best selections, evident in the form of a reduced variance of the majority voting performance over different splits of the datasets. Nevertheless, the best validation selection does not necessarily remain the best for the testing set, and the same holds for any individual selection found among the best solutions. To avoid this risk we applied second-stage combining, using majority voting on the MV outputs of the best first-stage solutions. This strategy turned out to be successful and produced results slightly better than the individual

selections, but more importantly it also improved the reliability and stability of the output performance. These results allow choosing an arbitrarily large number of best selections for second-stage fusion without risking a dramatic loss of generalization ability, while at the same time preserving the generally good performance of the system.

References

1. Bezdek J.C.: Fuzzy Models and Algorithms for Pattern Recognition and Image Processing. Kluwer Academic, Boston (1999)
2. Sharkey A.J.C.: Combining Artificial Neural Nets: Ensemble and Modular Multi-net Systems. Springer-Verlag, Berlin Heidelberg New York (1999)
3. Zhilkin P.A., Somorjai R.L.: Application of Several Methods of Classification Fusion to Magnetic Resonance Spectra. Connection Science 8(3,4) (1996)
4. Rogova G.: Combining the Results of Several Neural Network Classifiers. Neural Networks 7(5) (1994)
5. Xu L., Krzyzak A.: Methods of Combining Multiple Classifiers and Their Applications to Handwriting Recognition. IEEE Transactions on Systems, Man, and Cybernetics 23(8) (1992)
6. Partridge D., Griffith N.: Strategies for Improving Neural Net Generalization. Neural Computing and Applications 3 (1995)
7. Sharkey A.J.C., Sharkey N.E.: Combining Diverse Neural Nets. The Knowledge Engineering Review 12(3) (1997)
8. Kuncheva L.I., Whitaker C.J., Shipp C.A., Duin R.P.W.: Limits on the Majority Vote Accuracy in Classifier Fusion. Submitted to IEEE Transactions on Pattern Analysis and Machine Intelligence
9. Ruta D., Gabrys B.: A Theoretical Analysis of the Limits of Majority Voting in Multiple Classifier Systems. Technical Report No. 11, University of Paisley (2000)
10. Kuncheva L.I., Whitaker C.J.: Measures of Diversity in Classifier Ensembles. Submitted to Machine Learning
11. Ruta D., Gabrys B.: Analysis of the Correlation Between Majority Voting Errors and the Diversity Measures in Multiple Classifier Systems.
Accepted for the International Symposium on Soft Computing SOCO
12. Kuncheva L., Jain L.C.: Designing Classifier Fusion Systems by Genetic Algorithms. To appear in IEEE Transactions on Evolutionary Computation
13. Cho S.B.: Pattern Recognition With Neural Networks Combined by Genetic Algorithms. Fuzzy Sets and Systems 103 (1999)
14. Cho S.B., Kim J.H.: Combining Multiple Neural Networks by Fuzzy Integral for Robust Classification. IEEE Trans. on Systems, Man, and Cybernetics 25(2) (1995)
15. Kuncheva L.I., Bezdek J.C.: On Combining Classifiers by Fuzzy Templates. Proc. NAFIPS'98, Pensacola, FL (1998)
16. Davis L.: Handbook of Genetic Algorithms. Van Nostrand Reinhold, New York (1991)
17. Glover F., Laguna M.: Tabu Search. Kluwer Academic Publishers, Boston (1997)
18. Baluja S.: Population-Based Incremental Learning: A Method for Integrating Genetic Search Based Function Optimization and Competitive Learning. Technical Report, Carnegie Mellon University, Pittsburgh PA (1994)


More information

Literature Review On Implementing Binary Knapsack problem

Literature Review On Implementing Binary Knapsack problem Literature Review On Implementing Binary Knapsack problem Ms. Niyati Raj, Prof. Jahnavi Vitthalpura PG student Department of Information Technology, L.D. College of Engineering, Ahmedabad, India Assistant

More information

ISSN: [Keswani* et al., 7(1): January, 2018] Impact Factor: 4.116

ISSN: [Keswani* et al., 7(1): January, 2018] Impact Factor: 4.116 IJESRT INTERNATIONAL JOURNAL OF ENGINEERING SCIENCES & RESEARCH TECHNOLOGY AUTOMATIC TEST CASE GENERATION FOR PERFORMANCE ENHANCEMENT OF SOFTWARE THROUGH GENETIC ALGORITHM AND RANDOM TESTING Bright Keswani,

More information

Inducing Parameters of a Decision Tree for Expert System Shell McESE by Genetic Algorithm

Inducing Parameters of a Decision Tree for Expert System Shell McESE by Genetic Algorithm Inducing Parameters of a Decision Tree for Expert System Shell McESE by Genetic Algorithm I. Bruha and F. Franek Dept of Computing & Software, McMaster University Hamilton, Ont., Canada, L8S4K1 Email:

More information

Dynamic Ensemble Construction via Heuristic Optimization

Dynamic Ensemble Construction via Heuristic Optimization Dynamic Ensemble Construction via Heuristic Optimization Şenay Yaşar Sağlam and W. Nick Street Department of Management Sciences The University of Iowa Abstract Classifier ensembles, in which multiple

More information

Design of Nearest Neighbor Classifiers Using an Intelligent Multi-objective Evolutionary Algorithm

Design of Nearest Neighbor Classifiers Using an Intelligent Multi-objective Evolutionary Algorithm Design of Nearest Neighbor Classifiers Using an Intelligent Multi-objective Evolutionary Algorithm Jian-Hung Chen, Hung-Ming Chen, and Shinn-Ying Ho Department of Information Engineering and Computer Science,

More information

Abstract. 1 Introduction

Abstract. 1 Introduction Shape optimal design using GA and BEM Eisuke Kita & Hisashi Tanie Department of Mechano-Informatics and Systems, Nagoya University, Nagoya 464-01, Japan Abstract This paper describes a shape optimization

More information

Extending MATLAB and GA to Solve Job Shop Manufacturing Scheduling Problems

Extending MATLAB and GA to Solve Job Shop Manufacturing Scheduling Problems Extending MATLAB and GA to Solve Job Shop Manufacturing Scheduling Problems Hamidullah Khan Niazi 1, Sun Hou-Fang 2, Zhang Fa-Ping 3, Riaz Ahmed 4 ( 1, 4 National University of Sciences and Technology

More information

V.Petridis, S. Kazarlis and A. Papaikonomou

V.Petridis, S. Kazarlis and A. Papaikonomou Proceedings of IJCNN 93, p.p. 276-279, Oct. 993, Nagoya, Japan. A GENETIC ALGORITHM FOR TRAINING RECURRENT NEURAL NETWORKS V.Petridis, S. Kazarlis and A. Papaikonomou Dept. of Electrical Eng. Faculty of

More information

MAXIMUM LIKELIHOOD ESTIMATION USING ACCELERATED GENETIC ALGORITHMS

MAXIMUM LIKELIHOOD ESTIMATION USING ACCELERATED GENETIC ALGORITHMS In: Journal of Applied Statistical Science Volume 18, Number 3, pp. 1 7 ISSN: 1067-5817 c 2011 Nova Science Publishers, Inc. MAXIMUM LIKELIHOOD ESTIMATION USING ACCELERATED GENETIC ALGORITHMS Füsun Akman

More information

GRANULAR COMPUTING AND EVOLUTIONARY FUZZY MODELLING FOR MECHANICAL PROPERTIES OF ALLOY STEELS. G. Panoutsos and M. Mahfouf

GRANULAR COMPUTING AND EVOLUTIONARY FUZZY MODELLING FOR MECHANICAL PROPERTIES OF ALLOY STEELS. G. Panoutsos and M. Mahfouf GRANULAR COMPUTING AND EVOLUTIONARY FUZZY MODELLING FOR MECHANICAL PROPERTIES OF ALLOY STEELS G. Panoutsos and M. Mahfouf Institute for Microstructural and Mechanical Process Engineering: The University

More information

NOVEL HYBRID GENETIC ALGORITHM WITH HMM BASED IRIS RECOGNITION

NOVEL HYBRID GENETIC ALGORITHM WITH HMM BASED IRIS RECOGNITION NOVEL HYBRID GENETIC ALGORITHM WITH HMM BASED IRIS RECOGNITION * Prof. Dr. Ban Ahmed Mitras ** Ammar Saad Abdul-Jabbar * Dept. of Operation Research & Intelligent Techniques ** Dept. of Mathematics. College

More information

Applied Cloning Techniques for a Genetic Algorithm Used in Evolvable Hardware Design

Applied Cloning Techniques for a Genetic Algorithm Used in Evolvable Hardware Design Applied Cloning Techniques for a Genetic Algorithm Used in Evolvable Hardware Design Viet C. Trinh vtrinh@isl.ucf.edu Gregory A. Holifield greg.holifield@us.army.mil School of Electrical Engineering and

More information

Monika Maharishi Dayanand University Rohtak

Monika Maharishi Dayanand University Rohtak Performance enhancement for Text Data Mining using k means clustering based genetic optimization (KMGO) Monika Maharishi Dayanand University Rohtak ABSTRACT For discovering hidden patterns and structures

More information

Some questions of consensus building using co-association

Some questions of consensus building using co-association Some questions of consensus building using co-association VITALIY TAYANOV Polish-Japanese High School of Computer Technics Aleja Legionow, 4190, Bytom POLAND vtayanov@yahoo.com Abstract: In this paper

More information

A Parallel Evolutionary Algorithm for Discovery of Decision Rules

A Parallel Evolutionary Algorithm for Discovery of Decision Rules A Parallel Evolutionary Algorithm for Discovery of Decision Rules Wojciech Kwedlo Faculty of Computer Science Technical University of Bia lystok Wiejska 45a, 15-351 Bia lystok, Poland wkwedlo@ii.pb.bialystok.pl

More information

Genetic Algorithm Performance with Different Selection Methods in Solving Multi-Objective Network Design Problem

Genetic Algorithm Performance with Different Selection Methods in Solving Multi-Objective Network Design Problem etic Algorithm Performance with Different Selection Methods in Solving Multi-Objective Network Design Problem R. O. Oladele Department of Computer Science University of Ilorin P.M.B. 1515, Ilorin, NIGERIA

More information

Multi-objective pattern and feature selection by a genetic algorithm

Multi-objective pattern and feature selection by a genetic algorithm H. Ishibuchi, T. Nakashima: Multi-objective pattern and feature selection by a genetic algorithm, Proc. of Genetic and Evolutionary Computation Conference (Las Vegas, Nevada, U.S.A.) pp.1069-1076 (July

More information

Artificial Intelligence

Artificial Intelligence Artificial Intelligence Informed Search and Exploration Chapter 4 (4.3 4.6) Searching: So Far We ve discussed how to build goal-based and utility-based agents that search to solve problems We ve also presented

More information

Dynamic Selection of Ensembles of Classifiers Using Contextual Information

Dynamic Selection of Ensembles of Classifiers Using Contextual Information Dynamic Selection of Ensembles of Classifiers Using Contextual Information Paulo R. Cavalin 1, Robert Sabourin 1, and Ching Y. Suen 2 1 École de Technologie Supérieure, 1100 Notre-dame ouest, Montreal(QC),

More information

Research Article Path Planning Using a Hybrid Evolutionary Algorithm Based on Tree Structure Encoding

Research Article Path Planning Using a Hybrid Evolutionary Algorithm Based on Tree Structure Encoding e Scientific World Journal, Article ID 746260, 8 pages http://dx.doi.org/10.1155/2014/746260 Research Article Path Planning Using a Hybrid Evolutionary Algorithm Based on Tree Structure Encoding Ming-Yi

More information

A New Exam Timetabling Algorithm

A New Exam Timetabling Algorithm A New Exam Timetabling Algorithm K.J. Batenburg W.J. Palenstijn Leiden Institute of Advanced Computer Science (LIACS), Universiteit Leiden P.O. Box 9512, 2300 RA Leiden, The Netherlands {kbatenbu, wpalenst}@math.leidenuniv.nl

More information

Neural Network Weight Selection Using Genetic Algorithms

Neural Network Weight Selection Using Genetic Algorithms Neural Network Weight Selection Using Genetic Algorithms David Montana presented by: Carl Fink, Hongyi Chen, Jack Cheng, Xinglong Li, Bruce Lin, Chongjie Zhang April 12, 2005 1 Neural Networks Neural networks

More information

A Memetic Heuristic for the Co-clustering Problem

A Memetic Heuristic for the Co-clustering Problem A Memetic Heuristic for the Co-clustering Problem Mohammad Khoshneshin 1, Mahtab Ghazizadeh 2, W. Nick Street 1, and Jeffrey W. Ohlmann 1 1 The University of Iowa, Iowa City IA 52242, USA {mohammad-khoshneshin,nick-street,jeffrey-ohlmann}@uiowa.edu

More information

Suppose you have a problem You don t know how to solve it What can you do? Can you use a computer to somehow find a solution for you?

Suppose you have a problem You don t know how to solve it What can you do? Can you use a computer to somehow find a solution for you? Gurjit Randhawa Suppose you have a problem You don t know how to solve it What can you do? Can you use a computer to somehow find a solution for you? This would be nice! Can it be done? A blind generate

More information

CHAPTER 6 REAL-VALUED GENETIC ALGORITHMS

CHAPTER 6 REAL-VALUED GENETIC ALGORITHMS CHAPTER 6 REAL-VALUED GENETIC ALGORITHMS 6.1 Introduction Gradient-based algorithms have some weaknesses relative to engineering optimization. Specifically, it is difficult to use gradient-based algorithms

More information

MODELLING DOCUMENT CATEGORIES BY EVOLUTIONARY LEARNING OF TEXT CENTROIDS

MODELLING DOCUMENT CATEGORIES BY EVOLUTIONARY LEARNING OF TEXT CENTROIDS MODELLING DOCUMENT CATEGORIES BY EVOLUTIONARY LEARNING OF TEXT CENTROIDS J.I. Serrano M.D. Del Castillo Instituto de Automática Industrial CSIC. Ctra. Campo Real km.0 200. La Poveda. Arganda del Rey. 28500

More information

A Unified Framework to Integrate Supervision and Metric Learning into Clustering

A Unified Framework to Integrate Supervision and Metric Learning into Clustering A Unified Framework to Integrate Supervision and Metric Learning into Clustering Xin Li and Dan Roth Department of Computer Science University of Illinois, Urbana, IL 61801 (xli1,danr)@uiuc.edu December

More information

Collaborative Rough Clustering

Collaborative Rough Clustering Collaborative Rough Clustering Sushmita Mitra, Haider Banka, and Witold Pedrycz Machine Intelligence Unit, Indian Statistical Institute, Kolkata, India {sushmita, hbanka r}@isical.ac.in Dept. of Electrical

More information

International Journal of Digital Application & Contemporary research Website: (Volume 1, Issue 7, February 2013)

International Journal of Digital Application & Contemporary research Website:   (Volume 1, Issue 7, February 2013) Performance Analysis of GA and PSO over Economic Load Dispatch Problem Sakshi Rajpoot sakshirajpoot1988@gmail.com Dr. Sandeep Bhongade sandeepbhongade@rediffmail.com Abstract Economic Load dispatch problem

More information

Fast Fuzzy Clustering of Infrared Images. 2. brfcm

Fast Fuzzy Clustering of Infrared Images. 2. brfcm Fast Fuzzy Clustering of Infrared Images Steven Eschrich, Jingwei Ke, Lawrence O. Hall and Dmitry B. Goldgof Department of Computer Science and Engineering, ENB 118 University of South Florida 4202 E.

More information

Using Genetic Algorithms to Solve the Box Stacking Problem

Using Genetic Algorithms to Solve the Box Stacking Problem Using Genetic Algorithms to Solve the Box Stacking Problem Jenniffer Estrada, Kris Lee, Ryan Edgar October 7th, 2010 Abstract The box stacking or strip stacking problem is exceedingly difficult to solve

More information

Preprocessing of Stream Data using Attribute Selection based on Survival of the Fittest

Preprocessing of Stream Data using Attribute Selection based on Survival of the Fittest Preprocessing of Stream Data using Attribute Selection based on Survival of the Fittest Bhakti V. Gavali 1, Prof. Vivekanand Reddy 2 1 Department of Computer Science and Engineering, Visvesvaraya Technological

More information

Using Decision Boundary to Analyze Classifiers

Using Decision Boundary to Analyze Classifiers Using Decision Boundary to Analyze Classifiers Zhiyong Yan Congfu Xu College of Computer Science, Zhejiang University, Hangzhou, China yanzhiyong@zju.edu.cn Abstract In this paper we propose to use decision

More information

Binary Representations of Integers and the Performance of Selectorecombinative Genetic Algorithms

Binary Representations of Integers and the Performance of Selectorecombinative Genetic Algorithms Binary Representations of Integers and the Performance of Selectorecombinative Genetic Algorithms Franz Rothlauf Department of Information Systems University of Bayreuth, Germany franz.rothlauf@uni-bayreuth.de

More information

Predicting Diabetes using Neural Networks and Randomized Optimization

Predicting Diabetes using Neural Networks and Randomized Optimization Predicting Diabetes using Neural Networks and Randomized Optimization Kunal Sharma GTID: ksharma74 CS 4641 Machine Learning Abstract This paper analysis the following randomized optimization techniques

More information

A Genetic k-modes Algorithm for Clustering Categorical Data

A Genetic k-modes Algorithm for Clustering Categorical Data A Genetic k-modes Algorithm for Clustering Categorical Data Guojun Gan, Zijiang Yang, and Jianhong Wu Department of Mathematics and Statistics, York University, Toronto, Ontario, Canada M3J 1P3 {gjgan,

More information

Hardware Neuronale Netzwerke - Lernen durch künstliche Evolution (?)

Hardware Neuronale Netzwerke - Lernen durch künstliche Evolution (?) SKIP - May 2004 Hardware Neuronale Netzwerke - Lernen durch künstliche Evolution (?) S. G. Hohmann, Electronic Vision(s), Kirchhoff Institut für Physik, Universität Heidelberg Hardware Neuronale Netzwerke

More information

GENETIC ALGORITHM VERSUS PARTICLE SWARM OPTIMIZATION IN N-QUEEN PROBLEM

GENETIC ALGORITHM VERSUS PARTICLE SWARM OPTIMIZATION IN N-QUEEN PROBLEM Journal of Al-Nahrain University Vol.10(2), December, 2007, pp.172-177 Science GENETIC ALGORITHM VERSUS PARTICLE SWARM OPTIMIZATION IN N-QUEEN PROBLEM * Azhar W. Hammad, ** Dr. Ban N. Thannoon Al-Nahrain

More information

An evolutionary annealing-simplex algorithm for global optimisation of water resource systems

An evolutionary annealing-simplex algorithm for global optimisation of water resource systems FIFTH INTERNATIONAL CONFERENCE ON HYDROINFORMATICS 1-5 July 2002, Cardiff, UK C05 - Evolutionary algorithms in hydroinformatics An evolutionary annealing-simplex algorithm for global optimisation of water

More information

Evolutionary Computation Algorithms for Cryptanalysis: A Study

Evolutionary Computation Algorithms for Cryptanalysis: A Study Evolutionary Computation Algorithms for Cryptanalysis: A Study Poonam Garg Information Technology and Management Dept. Institute of Management Technology Ghaziabad, India pgarg@imt.edu Abstract The cryptanalysis

More information

Job Shop Scheduling Problem (JSSP) Genetic Algorithms Critical Block and DG distance Neighbourhood Search

Job Shop Scheduling Problem (JSSP) Genetic Algorithms Critical Block and DG distance Neighbourhood Search A JOB-SHOP SCHEDULING PROBLEM (JSSP) USING GENETIC ALGORITHM (GA) Mahanim Omar, Adam Baharum, Yahya Abu Hasan School of Mathematical Sciences, Universiti Sains Malaysia 11800 Penang, Malaysia Tel: (+)

More information

Heuristic Optimisation

Heuristic Optimisation Heuristic Optimisation Revision Lecture Sándor Zoltán Németh http://web.mat.bham.ac.uk/s.z.nemeth s.nemeth@bham.ac.uk University of Birmingham S Z Németh (s.nemeth@bham.ac.uk) Heuristic Optimisation University

More information

Kernel Combination Versus Classifier Combination

Kernel Combination Versus Classifier Combination Kernel Combination Versus Classifier Combination Wan-Jui Lee 1, Sergey Verzakov 2, and Robert P.W. Duin 2 1 EE Department, National Sun Yat-Sen University, Kaohsiung, Taiwan wrlee@water.ee.nsysu.edu.tw

More information

Incorporation of Scalarizing Fitness Functions into Evolutionary Multiobjective Optimization Algorithms

Incorporation of Scalarizing Fitness Functions into Evolutionary Multiobjective Optimization Algorithms H. Ishibuchi, T. Doi, and Y. Nojima, Incorporation of scalarizing fitness functions into evolutionary multiobjective optimization algorithms, Lecture Notes in Computer Science 4193: Parallel Problem Solving

More information

What is GOSET? GOSET stands for Genetic Optimization System Engineering Tool

What is GOSET? GOSET stands for Genetic Optimization System Engineering Tool Lecture 5: GOSET 1 What is GOSET? GOSET stands for Genetic Optimization System Engineering Tool GOSET is a MATLAB based genetic algorithm toolbox for solving optimization problems 2 GOSET Features Wide

More information

Using Genetic Algorithms to optimize ACS-TSP

Using Genetic Algorithms to optimize ACS-TSP Using Genetic Algorithms to optimize ACS-TSP Marcin L. Pilat and Tony White School of Computer Science, Carleton University, 1125 Colonel By Drive, Ottawa, ON, K1S 5B6, Canada {mpilat,arpwhite}@scs.carleton.ca

More information

Genetic Algorithms Variations and Implementation Issues

Genetic Algorithms Variations and Implementation Issues Genetic Algorithms Variations and Implementation Issues CS 431 Advanced Topics in AI Classic Genetic Algorithms GAs as proposed by Holland had the following properties: Randomly generated population Binary

More information

Solving Sudoku Puzzles with Node Based Coincidence Algorithm

Solving Sudoku Puzzles with Node Based Coincidence Algorithm Solving Sudoku Puzzles with Node Based Coincidence Algorithm Kiatsopon Waiyapara Department of Compute Engineering, Faculty of Engineering, Chulalongkorn University, Bangkok, Thailand kiatsopon.w@gmail.com

More information

CHAPTER 4 GENETIC ALGORITHM

CHAPTER 4 GENETIC ALGORITHM 69 CHAPTER 4 GENETIC ALGORITHM 4.1 INTRODUCTION Genetic Algorithms (GAs) were first proposed by John Holland (Holland 1975) whose ideas were applied and expanded on by Goldberg (Goldberg 1989). GAs is

More information

Automated Test Data Generation and Optimization Scheme Using Genetic Algorithm

Automated Test Data Generation and Optimization Scheme Using Genetic Algorithm 2011 International Conference on Software and Computer Applications IPCSIT vol.9 (2011) (2011) IACSIT Press, Singapore Automated Test Data Generation and Optimization Scheme Using Genetic Algorithm Roshni

More information

Color-Based Classification of Natural Rock Images Using Classifier Combinations

Color-Based Classification of Natural Rock Images Using Classifier Combinations Color-Based Classification of Natural Rock Images Using Classifier Combinations Leena Lepistö, Iivari Kunttu, and Ari Visa Tampere University of Technology, Institute of Signal Processing, P.O. Box 553,

More information

Topological Machining Fixture Layout Synthesis Using Genetic Algorithms

Topological Machining Fixture Layout Synthesis Using Genetic Algorithms Topological Machining Fixture Layout Synthesis Using Genetic Algorithms Necmettin Kaya Uludag University, Mechanical Eng. Department, Bursa, Turkey Ferruh Öztürk Uludag University, Mechanical Eng. Department,

More information

Mining High Order Decision Rules

Mining High Order Decision Rules Mining High Order Decision Rules Y.Y. Yao Department of Computer Science, University of Regina Regina, Saskatchewan, Canada S4S 0A2 e-mail: yyao@cs.uregina.ca Abstract. We introduce the notion of high

More information

SSV Criterion Based Discretization for Naive Bayes Classifiers

SSV Criterion Based Discretization for Naive Bayes Classifiers SSV Criterion Based Discretization for Naive Bayes Classifiers Krzysztof Grąbczewski kgrabcze@phys.uni.torun.pl Department of Informatics, Nicolaus Copernicus University, ul. Grudziądzka 5, 87-100 Toruń,

More information

Center-Based Sampling for Population-Based Algorithms

Center-Based Sampling for Population-Based Algorithms Center-Based Sampling for Population-Based Algorithms Shahryar Rahnamayan, Member, IEEE, G.GaryWang Abstract Population-based algorithms, such as Differential Evolution (DE), Particle Swarm Optimization

More information

Exploration vs. Exploitation in Differential Evolution

Exploration vs. Exploitation in Differential Evolution Exploration vs. Exploitation in Differential Evolution Ângela A. R. Sá 1, Adriano O. Andrade 1, Alcimar B. Soares 1 and Slawomir J. Nasuto 2 Abstract. Differential Evolution (DE) is a tool for efficient

More information

Genetic Algorithms for Vision and Pattern Recognition

Genetic Algorithms for Vision and Pattern Recognition Genetic Algorithms for Vision and Pattern Recognition Faiz Ul Wahab 11/8/2014 1 Objective To solve for optimization of computer vision problems using genetic algorithms 11/8/2014 2 Timeline Problem: Computer

More information

International Journal of Scientific & Engineering Research Volume 8, Issue 10, October-2017 ISSN

International Journal of Scientific & Engineering Research Volume 8, Issue 10, October-2017 ISSN 194 Prime Number Generation Using Genetic Algorithm Arpit Goel 1, Anuradha Brijwal 2, Sakshi Gautam 3 1 Dept. Of Computer Science & Engineering, Himalayan School of Engineering & Technology, Swami Rama

More information

Design of an Optimal Nearest Neighbor Classifier Using an Intelligent Genetic Algorithm

Design of an Optimal Nearest Neighbor Classifier Using an Intelligent Genetic Algorithm Design of an Optimal Nearest Neighbor Classifier Using an Intelligent Genetic Algorithm Shinn-Ying Ho *, Chia-Cheng Liu, Soundy Liu, and Jun-Wen Jou Department of Information Engineering, Feng Chia University,

More information

Sparse Matrices Reordering using Evolutionary Algorithms: A Seeded Approach

Sparse Matrices Reordering using Evolutionary Algorithms: A Seeded Approach 1 Sparse Matrices Reordering using Evolutionary Algorithms: A Seeded Approach David Greiner, Gustavo Montero, Gabriel Winter Institute of Intelligent Systems and Numerical Applications in Engineering (IUSIANI)

More information

From dynamic classifier selection to dynamic ensemble selection Albert H.R. Ko, Robert Sabourin, Alceu Souza Britto, Jr.

From dynamic classifier selection to dynamic ensemble selection Albert H.R. Ko, Robert Sabourin, Alceu Souza Britto, Jr. From dynamic classifier selection to dynamic ensemble selection Albert H.R. Ko, Robert Sabourin, Alceu Souza Britto, Jr Eider Sánchez Contenidos 1. Introduction 2. Proposed dynamic ensemble selection KNORA

More information

Fuzzy Ant Clustering by Centroid Positioning

Fuzzy Ant Clustering by Centroid Positioning Fuzzy Ant Clustering by Centroid Positioning Parag M. Kanade and Lawrence O. Hall Computer Science & Engineering Dept University of South Florida, Tampa FL 33620 @csee.usf.edu Abstract We

More information

Artificial Intelligence Application (Genetic Algorithm)

Artificial Intelligence Application (Genetic Algorithm) Babylon University College of Information Technology Software Department Artificial Intelligence Application (Genetic Algorithm) By Dr. Asaad Sabah Hadi 2014-2015 EVOLUTIONARY ALGORITHM The main idea about

More information

Distributed Optimization of Feature Mining Using Evolutionary Techniques

Distributed Optimization of Feature Mining Using Evolutionary Techniques Distributed Optimization of Feature Mining Using Evolutionary Techniques Karthik Ganesan Pillai University of Dayton Computer Science 300 College Park Dayton, OH 45469-2160 Dale Emery Courte University

More information

AIRFOIL SHAPE OPTIMIZATION USING EVOLUTIONARY ALGORITHMS

AIRFOIL SHAPE OPTIMIZATION USING EVOLUTIONARY ALGORITHMS AIRFOIL SHAPE OPTIMIZATION USING EVOLUTIONARY ALGORITHMS Emre Alpman Graduate Research Assistant Aerospace Engineering Department Pennstate University University Park, PA, 6802 Abstract A new methodology

More information

CHAPTER 1 INTRODUCTION

CHAPTER 1 INTRODUCTION 1 CHAPTER 1 INTRODUCTION 1.1 OPTIMIZATION OF MACHINING PROCESS AND MACHINING ECONOMICS In a manufacturing industry, machining process is to shape the metal parts by removing unwanted material. During the

More information

CHAPTER 5 ANT-FUZZY META HEURISTIC GENETIC SENSOR NETWORK SYSTEM FOR MULTI - SINK AGGREGATED DATA TRANSMISSION

CHAPTER 5 ANT-FUZZY META HEURISTIC GENETIC SENSOR NETWORK SYSTEM FOR MULTI - SINK AGGREGATED DATA TRANSMISSION CHAPTER 5 ANT-FUZZY META HEURISTIC GENETIC SENSOR NETWORK SYSTEM FOR MULTI - SINK AGGREGATED DATA TRANSMISSION 5.1 INTRODUCTION Generally, deployment of Wireless Sensor Network (WSN) is based on a many

More information

Meta- Heuristic based Optimization Algorithms: A Comparative Study of Genetic Algorithm and Particle Swarm Optimization

Meta- Heuristic based Optimization Algorithms: A Comparative Study of Genetic Algorithm and Particle Swarm Optimization 2017 2 nd International Electrical Engineering Conference (IEEC 2017) May. 19 th -20 th, 2017 at IEP Centre, Karachi, Pakistan Meta- Heuristic based Optimization Algorithms: A Comparative Study of Genetic

More information

DE/EDA: A New Evolutionary Algorithm for Global Optimization 1

DE/EDA: A New Evolutionary Algorithm for Global Optimization 1 DE/EDA: A New Evolutionary Algorithm for Global Optimization 1 Jianyong Sun, Qingfu Zhang and Edward P.K. Tsang Department of Computer Science, University of Essex, Wivenhoe Park, Colchester, CO4 3SQ,

More information

LOW-DENSITY PARITY-CHECK (LDPC) codes [1] can

LOW-DENSITY PARITY-CHECK (LDPC) codes [1] can 208 IEEE TRANSACTIONS ON MAGNETICS, VOL 42, NO 2, FEBRUARY 2006 Structured LDPC Codes for High-Density Recording: Large Girth and Low Error Floor J Lu and J M F Moura Department of Electrical and Computer

More information

Investigating the Application of Genetic Programming to Function Approximation

Investigating the Application of Genetic Programming to Function Approximation Investigating the Application of Genetic Programming to Function Approximation Jeremy E. Emch Computer Science Dept. Penn State University University Park, PA 16802 Abstract When analyzing a data set it

More information

IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 5, NO. 1, FEBRUARY

IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 5, NO. 1, FEBRUARY IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 5, NO. 1, FEBRUARY 2001 41 Brief Papers An Orthogonal Genetic Algorithm with Quantization for Global Numerical Optimization Yiu-Wing Leung, Senior Member,

More information

MODULE 6 Different Approaches to Feature Selection LESSON 10

MODULE 6 Different Approaches to Feature Selection LESSON 10 MODULE 6 Different Approaches to Feature Selection LESSON 10 Sequential Feature Selection Keywords: Forward, Backward, Sequential, Floating 1 Sequential Methods In these methods, features are either sequentially

More information

Segmentation of Noisy Binary Images Containing Circular and Elliptical Objects using Genetic Algorithms

Segmentation of Noisy Binary Images Containing Circular and Elliptical Objects using Genetic Algorithms Segmentation of Noisy Binary Images Containing Circular and Elliptical Objects using Genetic Algorithms B. D. Phulpagar Computer Engg. Dept. P. E. S. M. C. O. E., Pune, India. R. S. Bichkar Prof. ( Dept.

More information

Genetic Algorithms For Vertex. Splitting in DAGs 1

Genetic Algorithms For Vertex. Splitting in DAGs 1 Genetic Algorithms For Vertex Splitting in DAGs 1 Matthias Mayer 2 and Fikret Ercal 3 CSC-93-02 Fri Jan 29 1993 Department of Computer Science University of Missouri-Rolla Rolla, MO 65401, U.S.A. (314)

More information

The k-means Algorithm and Genetic Algorithm

The k-means Algorithm and Genetic Algorithm The k-means Algorithm and Genetic Algorithm k-means algorithm Genetic algorithm Rough set approach Fuzzy set approaches Chapter 8 2 The K-Means Algorithm The K-Means algorithm is a simple yet effective

More information

Cover Page. The handle holds various files of this Leiden University dissertation.

Cover Page. The handle   holds various files of this Leiden University dissertation. Cover Page The handle http://hdl.handle.net/1887/22055 holds various files of this Leiden University dissertation. Author: Koch, Patrick Title: Efficient tuning in supervised machine learning Issue Date:

More information

Neural Network Regularization and Ensembling Using Multi-objective Evolutionary Algorithms

Neural Network Regularization and Ensembling Using Multi-objective Evolutionary Algorithms Neural Network Regularization and Ensembling Using Multi-objective Evolutionary Algorithms Yaochu Jin Honda Research Institute Europe Carl-Legien-Str 7 Offenbach, GERMANY Email: yaochujin@honda-ride Tatsuya

More information

Leave-One-Out Support Vector Machines

Leave-One-Out Support Vector Machines Leave-One-Out Support Vector Machines Jason Weston Department of Computer Science Royal Holloway, University of London, Egham Hill, Egham, Surrey, TW20 OEX, UK. Abstract We present a new learning algorithm

More information

Clustering Analysis of Simple K Means Algorithm for Various Data Sets in Function Optimization Problem (Fop) of Evolutionary Programming

Clustering Analysis of Simple K Means Algorithm for Various Data Sets in Function Optimization Problem (Fop) of Evolutionary Programming Clustering Analysis of Simple K Means Algorithm for Various Data Sets in Function Optimization Problem (Fop) of Evolutionary Programming R. Karthick 1, Dr. Malathi.A 2 Research Scholar, Department of Computer

More information

Nuclear Research Reactors Accidents Diagnosis Using Genetic Algorithm/Artificial Neural Networks

Nuclear Research Reactors Accidents Diagnosis Using Genetic Algorithm/Artificial Neural Networks Nuclear Research Reactors Accidents Diagnosis Using Genetic Algorithm/Artificial Neural Networks Abdelfattah A. Ahmed**, Nwal A. Alfishawy*, Mohamed A. Albrdini* and Imbaby I. Mahmoud** * Dept of Comp.

More information

Genetic Algorithm for Finding Shortest Path in a Network

Genetic Algorithm for Finding Shortest Path in a Network Intern. J. Fuzzy Mathematical Archive Vol. 2, 2013, 43-48 ISSN: 2320 3242 (P), 2320 3250 (online) Published on 26 August 2013 www.researchmathsci.org International Journal of Genetic Algorithm for Finding

More information

Boosting Algorithms for Parallel and Distributed Learning

Boosting Algorithms for Parallel and Distributed Learning Distributed and Parallel Databases, 11, 203 229, 2002 c 2002 Kluwer Academic Publishers. Manufactured in The Netherlands. Boosting Algorithms for Parallel and Distributed Learning ALEKSANDAR LAZAREVIC

More information