Genetic algorithm and forward method for feature selection in EEG feature space
Journal of Theoretical and Applied Computer Science, Vol. 7, No. 2, 2013

Genetic algorithm and forward method for feature selection in EEG feature space

Izabela Rejer, Krzysztof Lorenz
Faculty of Computer Science and Information Technology, West Pomeranian University of Technology, Szczecin, Poland

Abstract: Many problems arise in the process of building a brain-computer interface (BCI) based on electroencephalographic (EEG) signals. One of them is the huge imbalance between the number of experiments that can be conducted and the size of the feature space containing features extracted from the recorded signals. To reduce this imbalance, methods for feature selection have to be applied. One approach to feature selection, often taken in brain-computer interface research, is a classic genetic algorithm that codes all features within each individual. This study shows that although this approach yields a feature set of high classification precision, it also leads to a feature set that is highly redundant compared to a set of features selected with a forward selection method or with a genetic algorithm whose individuals have a given (very small) number of genes.

Keywords: brain-computer interface, electroencephalography, feature selection, genetic algorithm, forward selection, SVM classifier

1. Introduction

One of the most important issues in designing a brain-computer interface is to ensure high accuracy of the classifier whose task is to decide to which of the predefined classes the currently recorded EEG signal should be assigned. A wrong diagnosis results in executing improper actions.
While such mistakes are sometimes tolerable when the interface is used to support the user's communication with the external world (they only slow the communication down), they must not happen when the BCI is used as a system controlling real devices, since they could be dangerous for the interface's user. Hence, the basic direction of research on BCI systems is the search for methods and techniques which increase the classification accuracy. The classification precision depends on many decisions taken at succeeding stages of the process of building a BCI system, from feature extraction and selection to the choice of a classifier and its parameters. It is difficult to decide which of these stages is the most important for the final quality of the classification, since mistakes made at any of them can decrease the classification accuracy. However, some authors point out that two of them demand special attention, i.e. feature extraction and selection [1][2]. While research on algorithms for extracting features from raw EEG signals is often conducted and reported in the scientific literature, a surprisingly small number of papers touches the problem of designing efficient algorithms for feature selection [3][4]. Meanwhile, feature selection is so important because it allows reducing the dimensionality of the
analyzed data set [5]. Due to the smaller number of features, the interface is not only faster but also more convenient for the user. Of course, the speed of the interface is not important during the construction stage, which takes place mostly in an off-line mode; however, it is extremely important during daily use of a device, where the signal detection has to be made immediately, which leaves no time to extract unnecessary features. The problem of feature selection is not a trivial one, because the dimension of the feature space that contains features extracted from raw EEG signals is often very large ( features). Such a large space of potential features excludes the possibility of an exhaustive search of the feature space, which is the only procedure guaranteed to discover the optimal subset of features. Instead, in order to achieve an enormous reduction of up to 99% of potential features, one of the heuristic feature selection methods discussed in the scientific literature [5][6] has to be applied. One such feature selection method, frequently applied within the BCI domain, is a genetic algorithm [3][7][8]. Its main advantage is that during the exploration of the space of possible solutions it does not evaluate solutions one by one, but evaluates a set of solutions simultaneously. Moreover, it is not prone to getting stuck in local minima and it does not require assumptions about the interactions between features [4]. Theoretically, different approaches to genetic algorithms can be used in the feature selection process (e.g. the Culling algorithm [7][10], an algorithm with a limited number of features [8], an algorithm with aggressive mutation [11], etc.). In practice, however, mostly the simplest approach is used, i.e. the approach with binary individuals coding all features existing in the problem at hand.
In this approach each gene of an individual holds information about whether the corresponding feature is present in the individual (allele 1) or not (allele 0) [3][12]. The fitness function is usually pure classification accuracy. This approach is easy to implement; however, it has one weakness: by using pure classification accuracy as a fitness function, the optimization process is directed at creating individuals ensuring 100% classification accuracy, regardless of the number of features comprised in these individuals. Since an individual with a larger number of features also means a corresponding classifier with a larger number of free parameters, the algorithm that maximizes classification accuracy has no incentive to explore solutions that code an essentially smaller number of features than the solutions of the initial population. Hence, when a genetic algorithm is to be used in the feature selection process, more sophisticated approaches to coding individuals or more sophisticated fitness functions should be chosen. Another feature selection method, popular within the domain of pattern recognition but rather rarely used for EEG feature sets, is a step-selection method. Step-selection means that features are added to or removed from a feature set one by one in succeeding steps of the survey. At each step, the feature that is best/worst from the point of view of a given criterion is added to/removed from the feature set. The aim of this study is to examine whether, in the case of BCI data sets, the application of a step-selection algorithm allows finding a feature subset of higher classification accuracy than the feature subset found using a genetic algorithm that processes individuals coding all features from the analyzed feature space.
Additionally, the results returned by the stepwise forward selection algorithm will be compared with results returned by a genetic algorithm that processes individuals containing a limited number of genes. Hence, the paper should answer three questions: Is the accuracy of the classifier built over the feature set returned by the forward selection method higher than the accuracy of the classifier built over the feature set returned by the classic genetic algorithm?
Is the quality of the first classifier better than that of the second one? Does a change of the coding method of the genetic algorithm correspond to essential changes in the quality of the final feature set? The research will be carried out with a data set that was submitted to the second BCI Competition (data set III, motor imagery) by the Department of Medical Informatics, Institute for Biomedical Engineering, Graz University of Technology [13]. The quality of the feature sets returned by all three methods will be compared in terms of the classification accuracy obtained with SVM classifiers using the selected features.

2. Applied methods

2.1. Genetic algorithm

A genetic algorithm (GA) is one of the heuristic methods for solving optimization problems. Nowadays, this method is used in many different research fields, but it originates from the genetic sciences. Hence, most terms used to describe the optimization problems are inherited directly from biological terms. To run the optimization process with a GA, first the problem environment has to be defined, i.e.: a method of coding problem solutions in the form of GA individuals, a fitness function used for evaluating individuals in each generation, genetic operations used for mixing and modifying individuals, a method for selecting individuals, and other additional GA parameters. After defining the problem environment, the chosen genetic algorithm is applied to process individuals over a given number of generations. The general scheme of the classic genetic algorithm, i.e. the Holland algorithm from 1975 [11], can be described as follows. In the initial stage of the algorithm a set of randomly selected individuals, each coding one solution of the problem at hand, is created. The individuals are ranked according to the chosen criterion, given in the form of a fitness function.
Next, the solutions with too small values of the fitness function are removed from the set of solutions (selection stage). Their places are taken by new solutions, created by combining parts of solutions of high fitness (crossover stage). From time to time, random changes are made in the existing solutions; these changes allow the algorithm to explore entirely new areas of the problem domain (mutation stage). The entire process is repeated until a satisfactory solution is found. As mentioned in Section 1, different genetic algorithms can be used for feature selection. Of all these algorithms, the most commonly used is the one that processes individuals coding all extracted features. According to this approach, each gene of an individual corresponds to one feature and carries the information whether this feature is present in the given individual (allele one) or not (allele zero). The algorithm is mostly compatible with the classic Holland algorithm, which means that it starts from a random population and applies one-gene mutation and one-point crossover. The quality of individuals created in succeeding iterations is evaluated according to the accuracy of classifiers built separately for each individual. As a result of the random selection of genes in the individuals of the initial population, each individual contains approximately half of all possible features (assuming a uniform distribution). In the case of a space composed of features extracted from raw EEG signals (usually containing at least features), starting the search for the optimal set of features from the middle of this set is not a profitable solution, because it can prevent a significant reduction of features. This is a result of directing the optimization process to maximize the classification accuracy, which favours individuals of higher accuracy, i.e. individuals that code solutions generating classifiers with a higher number of free parameters (and so a higher number of features). Theoretically, the optimization process does not have to be guided purely by the classifier results. It is possible, for example, to equip the GA fitness function with a penalty term which penalizes individuals coding too large a number of features. It is also possible to develop specialized genetic operators converting such unwelcome individuals into individuals carrying a smaller number of features. In practice, however, the scale of the required reduction of the feature set is so large that it is extremely difficult to develop a stable function penalizing individuals that carry too many features, or functions for converting these individuals. A much better solution is to run the GA with individuals of a limited number of features, coding only small subsets of the whole feature set. In order to apply such a solution, some changes in the genotype have to be made. First of all, there is no need to stick to binary coding; a much better solution is to use integer genes. Secondly, each gene should encode the index of one feature from the whole set of features. With such an approach, one individual contains the indexes of the features that should be delivered to the classifier inputs. The appropriate number of features (i.e. the number of genes contained in one individual) is set by the user before launching the algorithm, with respect to the number of recorded observations and the applied classifier. When such a coding method is applied, an individual with two or more equal genes can appear as a result of genetic operations or of the random selection of the initial population. In some applications, e.g. in the travelling salesman problem, such an individual, indicating a double visit to the same city, would be discarded as incorrect.
However, in the case of the feature selection problem, guided by the classification precision, such an individual is not considered defective; on the contrary, it may even be desirable. Of course, it has to be repaired, because multiple usage of the same feature in the classifier does not make sense, but this repair involves only removing all but one of the genes coding this feature. Why is such an individual desirable? Because if the accuracy of the classifier using the features coded in this individual was sufficiently high to allow the individual to survive the selection process, that would indicate that a further reduction of the number of features is possible.

2.2. Stepwise selection

Stepwise selection methods are heuristic methods defining the overall strategy of searching the space of possible solutions. There are three main types of step-selection methods: forward selection, backward selection and bidirectional selection. The only difference between them is the search direction. In the first method, the search starts with an empty set of features. This set is then gradually extended by adding one feature at each step of the procedure. To decide which feature should be added to the set, all features are sorted with respect to a selected criterion, and the best one is chosen. When the selection process is guided by the classifier, the selection criterion is usually the classification accuracy. The whole process ends when none of the remaining features is able to cause a further increase in the classifier accuracy. In the second method (backward selection), the selection process starts with a set containing all possible features, which are discarded from this set one by one in succeeding steps of the search. This time, of course, the feature eliminated from the feature set is the one whose removal causes the smallest loss (or even an improvement) in the classifier accuracy.
The last method is a simple mix of forward and backward selection, in which both elementary strategies are used alternately. The decision whether to use a forward or backward selection is
always determined by the characteristics of the given data set. In the case of an EEG feature space, forward step-selection is the only choice, because of the limited number of observations and hence the limited number of features that can be introduced to a classifier.

2.3. Support Vector Machine (SVM)

The Support Vector Machine method (SVM), developed by Vapnik, is fundamentally a method for binary classification, although it can be adapted for multi-class classification using special strategies, e.g. one-versus-rest or one-versus-one [14]. The basic idea of the SVM algorithm is to find the hyperplane separating two classes which correctly classifies new observations and which guarantees that the margin separating both classes is the largest one. In order to find a non-linear decision boundary, in the case of linearly non-separable classes, the training vectors are mapped into a higher-dimensional space using kernel functions (mostly Gaussian or radial functions). The SVM classifier uses a discrimination function of the form f(x) = w^T k(x) + b, where k(x) is a vector of evaluations of kernel functions centred at the support vectors (which are usually a subset of the training vectors), w is a vector of weights, and b is the bias [15]. The class of x is assigned by considering the sign of f(x).

3. Experiment setup

3.1. Feature extraction

The research was performed with a data set that was submitted to the second BCI Competition (data set III, motor imagery) by the Department of Medical Informatics, Institute for Biomedical Engineering, Graz University of Technology. The data set was recorded from a normal subject (female, 25 years old) whose task was to control the movements of a feedback bar by means of imagined movements of the left and right hand. Cues informing about the direction in which the feedback bar should be moved were displayed on a screen in the form of left and right arrows. The order of left and right cues was random.
The experiment consisted of 280 trials; each trial lasted 9 seconds. The first two seconds were quiet; at t=2s an acoustic stimulus was generated and a cross '+' was displayed for one second; then, at t=3s, an arrow (left or right) was displayed as a cue. The EEG signals were measured over three bipolar EEG channels (C3, Cz and C4), sampled at 128 Hz and preliminarily filtered between 0.5 and 30 Hz. The whole data set, containing data from 280 trials, was then divided into two equal subsets: the first intended for classifier training and the second for an external classifier test. Since only data from the first subset was published with target values (1 - left hand, 2 - right hand), only this subset could be used in the process of classifier training and testing. In the feature extraction process, the original data set was transformed into a set of 144 features, representing the band power calculated separately for: 12 frequency bands: the alpha band (8-13 Hz) and five sub-bands of the alpha band (8-9 Hz; 9-10 Hz; Hz; Hz; Hz), the beta band (13-30 Hz) and five sub-bands of the beta band (13-17 Hz; Hz; Hz; Hz; Hz); 6 seconds: 4, 5, 6, 7, 8 and 9 (data from seconds 1-3 were discarded because they covered the period before the cue presentation); 2 channels: C3 and C4 (channel Cz was omitted because, due to its location above the interhemispheric fissure, it is just outside the main brain areas responsible for hand movement, both real and imagined).
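The extraction scheme above (band power per frequency band, per second, per channel) can be sketched as follows. This is only an illustration: the paper does not state how the band power was estimated, so a Butterworth band-pass filter followed by mean squared amplitude is assumed here, and the exact sub-band boundaries (elided in the text) are filled with hypothetical 1-Hz alpha and roughly equal beta splits.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 128  # sampling frequency [Hz], as in the Graz data set

def band_power(signal, low, high, second, fs=FS):
    """Band power of one EEG channel within a given frequency band,
    computed over a single 1-second window of the trial.
    `second` is 1-based, matching the paper's numbering (seconds 4-9 used)."""
    # 4th-order Butterworth band-pass applied to the whole trial (assumption)
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, signal)
    # cut out the requested 1-second window and average the squared samples
    window = filtered[(second - 1) * fs : second * fs]
    return np.mean(window ** 2)

def extract_features(trial):
    """trial: dict mapping channel name -> 1-D array of 9*FS samples.
    Returns 144 features: 12 bands x 6 seconds x 2 channels."""
    # hypothetical sub-band boundaries; the paper elides most of them
    bands = [(8, 13), (8, 9), (9, 10), (10, 11), (11, 12), (12, 13),
             (13, 30), (13, 17), (17, 20), (20, 23), (23, 26), (26, 30)]
    features = []
    for channel in ("C3", "C4"):
        for second in range(4, 10):          # seconds 4-9
            for low, high in bands:
                features.append(band_power(trial[channel], low, high, second))
    return np.array(features)
```

Applied to each of the 140 labelled trials, this yields the 140 x 144 feature matrix that the selection methods operate on.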
3.2. Feature selection

Due to the very large number of extracted features, even greater than the number of observations, a feature selection procedure was undertaken in the next stage of the research. Three approaches, described in Section 2, were used in the selection process: forward selection, a genetic algorithm processing individuals coding all the extracted features (genetic algorithm 1), and a genetic algorithm processing individuals containing only several genes (genetic algorithm 2). In order to apply the forward selection method, the whole experiment had to be divided into stages. At each stage a set of classifiers was built, one classifier per feature from the set of remaining features. The feature used in the classifier of the highest accuracy was passed to the next stage and deleted from the original set of features. The following sets of classifiers were built at the succeeding stages of the experiment: stage I - 144 one-input classifiers; stage II - 143 two-input classifiers (the first input of each classifier was the feature chosen at the first stage and the second input was one of the remaining features); stage III - 142 three-input classifiers (two inputs of each classifier were the features chosen at the previous stages and the last input was one of the remaining features); stage IV - 141 four-input classifiers (according to the same scheme); stage V - 140 five-input classifiers (according to the same scheme); stage VI - 139 six-input classifiers (according to the same scheme). The feature selection process was stopped at the sixth stage because none of the remaining features, after being added to the six-input classifier, induced a further increase in the classification accuracy. In contrast to the six stages of forward selection, genetic algorithm 1 was run at once on the full array of features.
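The staged forward-selection procedure can be sketched as below. This is a hedged reconstruction, not the authors' code: scikit-learn's linear SVC and 10-fold cross-validation stand in for the classifier and accuracy estimate, and the stopping rule follows the paper's description (stop when no remaining feature improves the accuracy).

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def forward_selection(X, y, max_features=6):
    """Stepwise forward selection guided by cross-validated accuracy.
    At each stage every remaining feature is tried in turn (one classifier
    per candidate, as in stages I-VI) and the best one is kept; the loop
    stops when no candidate improves on the current accuracy."""
    selected, remaining = [], list(range(X.shape[1]))
    best_acc = 0.0
    while remaining and len(selected) < max_features:
        scores = []
        for f in remaining:
            cols = selected + [f]
            clf = SVC(kernel="linear")
            acc = cross_val_score(clf, X[:, cols], y, cv=10).mean()
            scores.append((acc, f))
        acc, f = max(scores)
        if acc <= best_acc:          # no remaining feature helps any more
            break
        best_acc = acc
        selected.append(f)
        remaining.remove(f)
    return selected, best_acc
```

On the 140 x 144 matrix from the previous section, each stage therefore fits one classifier per remaining feature, exactly matching the counts 144, 143, ..., 139 listed above.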
The basic parameters of the algorithm were set as follows: number of individuals: 50; number of chromosomes per individual: 1; number of genes per chromosome: 144; number of generations: 100. Each individual represented one of the possible solutions, i.e. one subset of features. Features were coded with binary genes: a gene value of one meant that the feature corresponding to the gene index was present in the solution, and a gene value of zero meant that the feature was absent [12]. The quality of individuals was evaluated with an SVM classifier. The initial population was chosen randomly. Selection was conducted with the tournament method (with a probability of 0.75). Two classic genetic operations were used: one-point crossover (with a probability of 0.8) and one-gene mutation (with a probability of 0.025). All parameters of genetic algorithm 2, apart from the coding method, were set according to the same scheme as in genetic algorithm 1. As regards the coding method, the number of genes per individual was set to six, and each gene could take one of 144 integer values, corresponding to the indexes of features from the feature space. The reason for setting the number of genes to six was an opinion, widely cited in the scientific literature, according to which, in order to train a classifier correctly, at least ten observations per class and per input dimension should be gathered [16]. Since the data set used in the research contained only 140 observations, the maximal number of features that could be used in the classifier without a threat of overfitting, assuming a linear classifier, was seven. However, since some observations were needed for the testing process, this number was additionally decreased and finally set to six.
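With the parameters listed above, genetic algorithm 1 can be sketched as follows. This is an illustrative reconstruction under stated assumptions: the fitness function is passed in as a callable (in the paper it is the cross-validated SVM accuracy; the test below uses a cheap stand-in), and the `repair` helper implements the duplicate-gene removal described for the integer-coded genetic algorithm 2.

```python
import random

N_FEATURES, POP_SIZE, N_GEN = 144, 50, 100
P_CROSS, P_MUT, P_TOURN = 0.8, 0.025, 0.75

def tournament(pop, fitness, p=P_TOURN):
    """Binary tournament: the fitter of two random individuals wins with probability p."""
    a, b = random.sample(pop, 2)
    better, worse = (a, b) if fitness(a) >= fitness(b) else (b, a)
    return better if random.random() < p else worse

def evolve(fitness):
    """Genetic algorithm 1: binary individuals coding all 144 features,
    one-point crossover and one-gene mutation; fitness stands in for the
    classification accuracy of the coded feature subset."""
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(POP_SIZE)]
    for _ in range(N_GEN):
        nxt = []
        while len(nxt) < POP_SIZE:
            p1, p2 = tournament(pop, fitness), tournament(pop, fitness)
            if random.random() < P_CROSS:            # one-point crossover
                cut = random.randrange(1, N_FEATURES)
                p1 = p1[:cut] + p2[cut:]
            child = list(p1)
            if random.random() < P_MUT:              # one-gene mutation
                g = random.randrange(N_FEATURES)
                child[g] = 1 - child[g]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

def repair(individual):
    """For the integer-coded genetic algorithm 2: keep only the first
    copy of any feature index that occurs more than once."""
    seen, kept = set(), []
    for gene in individual:
        if gene not in seen:
            seen.add(gene)
            kept.append(gene)
    return kept
```

In genetic algorithm 2 the individuals would instead be lists of six integers in 1..144, with `repair` applied after crossover and mutation; a repaired individual that is shorter than six genes signals that fewer features may suffice, as argued in Section 2.1.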
4. Classification

Due to the very adverse ratio of features to observations, a classic linear SVM was used in the classification process. The classification threshold was set to 0.5; hence, all classifier results greater than 0.5 were classified as class 2 (right hand), and results smaller than or equal to 0.5 were classified as class 1 (left hand). The classifier accuracy was tested with 10-fold cross-validation. The final accuracy measure of a given feature set was the mean value calculated from the classification accuracies obtained for all validation sets. The accuracy of a single validation set was calculated according to the following formula:

A_k = R_k / U_k, (1)

where: A_k - accuracy of the k-th validation subset (k = 1...10), R_k - number of correctly classified observations from the k-th validation subset, U_k - number of all observations in the k-th validation subset.

5. Results

5.1. Genetic algorithm 1

First, genetic algorithm 1 was run. The average accuracy of the randomly chosen initial population was 75.38%. The best individual contained 66 features and its classification accuracy was 88.46%. After 100 generations the average accuracy of the population increased to 87.69%. The best individual of the final population contained 74 features and its classification accuracy was 96.15% (the features contained in this individual are presented in Table 1).

Table 1. The set of 74 features chosen by genetic algorithm 1; gray cells indicate features encoded in the best individual, whose classification accuracy equals 96.15%. (The table body, indexed by frequency band [Hz], second, and channel C3/C4, did not survive extraction.)
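The evaluation scheme of Section 4 can be sketched as below, implementing formula (1) fold by fold. scikit-learn is assumed as the SVM implementation; its `predict` already applies the decision threshold, which corresponds to the 0.5 cut-off described above.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.svm import SVC

def cv_accuracy(X, y, n_folds=10):
    """Mean 10-fold cross-validation accuracy of a linear SVM, following (1):
    A_k = R_k / U_k for each validation subset, averaged over k = 1..10."""
    folds = StratifiedKFold(n_splits=n_folds, shuffle=True, random_state=0)
    accs = []
    for train_idx, val_idx in folds.split(X, y):
        clf = SVC(kernel="linear").fit(X[train_idx], y[train_idx])
        pred = clf.predict(X[val_idx])
        r_k = np.sum(pred == y[val_idx])   # correctly classified observations
        u_k = len(val_idx)                 # all observations in the subset
        accs.append(r_k / u_k)
    return float(np.mean(accs))
```

With 140 labelled observations, each fold trains on 126 observations and validates on 14, which is the 126-observation training-set size referred to in the Discussion.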
5.2. Forward selection method

In the second stage of the research the forward selection method was used. The results are gathered in Table 2. The succeeding columns of the table present: the stage number, the indexes of the features used in the classifier of the highest classification precision, the number of classifiers built at the stage, and the classifier accuracy calculated according to (1). The features added to the feature set at the succeeding stages of the selection process were as follows: stage I - feature no. - the power within the Hz frequency band, calculated over second 5 and channel C4; stage II - feature no. - band: Hz, second: 5, channel: C3; stage III - feature no. - band: Hz, second: 5, channel: C3; stage IV - feature no. 9 - band: 8-13 Hz, second: 6, channel: C4; stage V - feature no. 3 - band: 8-13 Hz, second: 6, channel: C3; stage VI - feature no. - band: Hz, second: 8, channel: C4.

Table 2. The accuracy of the classifiers containing the features chosen in subsequent stages of the forward selection method. (The table body, with columns Stage, Indexes of features, Number of classifiers and Classifier accuracy [%] for rows I-VI, did not survive extraction.)

5.3. Genetic algorithm 2

Finally, genetic algorithm 2 was applied. The number of genes per individual was set to six. The average classification accuracy of the randomly chosen initial population was 61.96%. The best individual of the initial population had a classification accuracy of 75.00%. After completing the algorithm, the average accuracy of the population increased to 81.57%. The best individual of the final population achieved a classification accuracy of 91.43% and contained the following features: feature no. 2 - band power in the 8-13 Hz band, calculated over second 5 and channel C3; feature no. 3 - band power in the 8-13 Hz band, calculated over second 6 and channel C3; feature no.
8 - band power in the 8-13 Hz band, calculated over second 5 and channel C4; feature no. 9 - band power in the 8-13 Hz band, calculated over second 6 and channel C4; feature no. - band power in the 8-13 Hz band, calculated over second 8 and channel C4; feature no. - band power in the Hz band, calculated over second 3 and channel C3.

6. Discussion

Comparing the pure classification accuracy obtained in all three experiments, it can be stated that the highest accuracy was obtained with the classifier whose input features were selected by genetic algorithm 1. The accuracy of this classifier, 96.15%, was 3.54% higher than the accuracy of the classifier using the features returned by the stepwise method
(92.86%) and 4.72% higher than the accuracy of the classifier using the features returned by genetic algorithm 2 (91.43%). However, the high accuracy of the classifier built in the first experiment should not be identified with high quality of this classifier. When comparing classifiers for BCIs, not only should the difference in accuracy be taken into consideration, but also the reliability of the results and the possibility of using them in on-line mode in real brain-computer interfaces. Discussing the results from this point of view, the number of features used in each classifier has to be taken into account. According to S. Raudys and A. Jain [16], the number of observations needed to properly train a classifier should equal at least 10 observations per input dimension and per output class. The classifier built over the feature set selected by genetic algorithm 1 does not meet this requirement; the number of observations per input feature and output class is, in its case, much smaller. To be precise, this number is even less than one (74 input variables and 2 classes against 126 observations in each of the 10 training sets). With such an unfavourable ratio of observations to input features, there is a high probability that the classifier adapted to irrelevant details of the succeeding training sets used in the cross-validation process, instead of generating a classification surface generalizing over the whole spectrum of observations. The ratio of observations to input variables was totally different for the classifiers built over the feature sets returned by the forward selection method and genetic algorithm 2. This time, the number of input features in each classifier was six, which means that there were even slightly more than 10 observations per feature and per class (126 observations, 6 features and 2 classes).
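The rule-of-thumb comparison above reduces to a one-line calculation, shown here for both classifiers (126 training observations per fold, 2 classes):

```python
def obs_per_feature_per_class(n_obs, n_features, n_classes=2):
    """Raudys & Jain rule of thumb: at least 10 training observations
    per input feature and per output class."""
    return n_obs / (n_features * n_classes)

# genetic algorithm 1: 74 features -> 126 / (74 * 2) ~ 0.85, under one observation
assert obs_per_feature_per_class(126, 74) < 1
# forward selection / genetic algorithm 2: 6 features -> 126 / (6 * 2) = 10.5
assert obs_per_feature_per_class(126, 6) > 10
```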
Hence, the classifiers built using both approaches should be regarded as much more reliable than the classifier built using genetic algorithm 1. Leaving aside the probable lack of generalisation capability of a classifier trained on 126 observations and equipped with 74 features, a clear disadvantage of the classifier built with the first approach is the very small reduction of the feature space (the reduction rate is about 50%). The two other methods achieved a much higher reduction rate (96%) with only slightly lower classification accuracy. As emphasized in Section 1, when a brain-computer interface is built, the tendency is to use as small a number of features as possible (due to the costs, interface speed and comfort of its application). For this reason, selection methods aimed at choosing a feature set composed of a limited number of features are more adequate than methods without such limits. Finally, comparing the results obtained with the two methods based on limited feature subsets, it is very difficult to determine which of them gave better results. Therefore, in view of the fact that both of them return only sub-optimal feature subsets, the best practice is to use both methods and select the feature subset which allows building the classifier of higher classification accuracy. Summarizing the Discussion section, the answers to the three questions posed in the Introduction are as follows: Question 1. The accuracy of the classifier built over the feature set returned by the forward selection method is the same as or slightly lower than the accuracy of the classifier built over the feature set returned by the classic genetic algorithm. Question 2. The quality of the first classifier is much better than that of the second one; quality is measured here by the final reduction rate. Question 3. The application of the coding method described in this paper causes essential changes in the quality of the final feature set.
Obviously, the answers given above refer strictly to the data set used in the survey. In order to draw general conclusions, many more data sets should be analyzed.
Conclusion

Summarizing the research described in this paper, it has to be concluded that, in the case of data sets characterized by a very low ratio of observations to features, selection methods which start the search from limited feature collections are a good choice. The other approach, i.e. starting the selection process from either the full set of features or half of the features from this set, may seriously limit the final reduction rate.

References

[1] Pfurtscheller G., Flotzinger D., Kalcher J. Brain-computer interface - a new communication device for handicapped persons. Journal of Microcomputer Applications, Vol. 16, No. 3, 1993.
[2] Hammon P.S., de Sa V.R. Preprocessing and meta-classification for brain-computer interfaces. IEEE Transactions on Biomedical Engineering, Vol. 54, No. 3, 2007.
[3] Peterson D.A., Knight J.N., Kirby M.J., Anderson Ch.W., Thaut M.H. Feature Selection and Blind Source Separation in an EEG-Based Brain-Computer Interface. EURASIP Journal on Applied Signal Processing, No. 19, 2005.
[4] Lakany H., Conway B.A. Understanding intention of movement from electroencephalograms. Expert Systems, Vol. 24, No. 5, 2007.
[5] Koprinska I. Feature Selection for Brain-Computer Interfaces. In: T. Theeramunkong et al. (Eds.), PAKDD Workshops 2009, LNAI No. 5669, Springer-Verlag, Berlin Heidelberg, 2010.
[6] Dias N.S., Kamrunnahar M., Mendes P.M., Schiff S.J., Correia J.H. Feature selection on movement imagery discrimination and attention detection. Medical & Biological Engineering & Computing, Vol. 48, No. 4, 2010.
[7] Yom-Tov E., Inbar G.F. Feature Selection for the Classification of Movements From Single Movement-Related Potentials. IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 10, No. 3, 2002.
[8] Kołodziej M., Majakowski A., Rak J.R.
A new Method of EEG Classification for BCI with Feature Extraction Based on Higher Order Statistics of Wavelet Components and Selection with Genetic Algorithms. ICANNGA 2011, Part I LNCS 6593, pp , Springer-Verlag Berlin, 2011 [9] Michalewicz Z. Genetic algorithms + data structures = evolutionary programs. Scientific and Technique Publishing House, Warsaw 1995 [10] Koller D., Sahami M., Toward optimal feature selection, Proc. Machine Learning, pp , 1996 [11] Rejer I. Genetic Algorithms in EEG Feature Selection for the Classification of Movements of the Left and Right Hand. Advances in Intelligent and Soft Computing, Springer, 2013 [12] Garrett D., Peterson D. A., Anderson Ch., Thaut M. H. Comparison of Linear, Nonlinear, and Feature Selection Methods for EEG Signal Classification. IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 11, No. 2, 2003, pp [13] Data set III, II BCI Competition, motor imaginary [dostęp: 2013] [14] Schlögl A., Lee F., Bischof H., Pfurtscheller G. Characterization of four-class motor imagery EEG data for the BCI-competition Journal of Neural Engineering, Vol. 2, No 4, 2005, pp
11 82 Izabela Rejer, Krzysztof Lorenz [15] Vapnik V., Statistical Learning Theory, New York: Wiley, 1998 [16] Jain A.K., Duin R.P.W., Mao J. A Review, Statistical Pattern Recognition, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, No. 1, 2000, pp. 4-37
More information