Comparison of PSO-Based Optimized Feature Computation for Automated Configuration of Multi-Sensor Systems


Kuncup Iswandy and Andreas Koenig
Institute of Integrated Sensor Systems, University of Kaiserslautern, 67663 Kaiserslautern, Germany
{kuncup@rhrk, koenig@eit}.uni-kl.de

Abstract. The design of intelligent sensor systems requires sophisticated methods from conventional signal processing and computational intelligence. Currently, a significant part of the overall system architecture still has to be elaborated manually in a tedious and time-consuming process by an experienced designer. Clearly, an automatic method for auto-configuration of sensor systems would be salient. In this paper, we contribute to the optimization of the feature computation step in the overall system design, investigating multi-level thresholding (MLT) and Gaussian windowing. Our goals are to compare these two feature computation methods and two evolutionary optimization techniques, i.e., the genetic algorithm (GA) and particle swarm optimization (PSO). For comparability with previous research work, gas sensor benchmark data is used. In the comparison of GA and PSO, the latter provided superior results, reaching 100% recognition in generalization for thresholding and proving to be the more powerful method.

1 Introduction

Intelligent sensor systems find increasingly widespread application; as the most recent examples, the fields of ambient intelligence and sensor networks can be mentioned. This remarkable increase in application is also due to the growing spectrum of available sensor principles and implementations. These require, however, a correspondingly larger variety of sensor electronics and sensor signal processing techniques to employ them efficiently in systems. The design of intelligent sensor systems is still predominantly conducted manually, where the design process goes through the principal steps of sensor selection and scene optimization, choice of signal and feature processing, dimensionality reduction, and classification (see Fig. 1). The processing steps of dimensionality reduction and classification are more and more subject to automation efforts employing learning and optimization techniques. However, the decisive task of selecting, combining, and parameterizing heuristic signal processing and feature computation methods is currently left to the human designer as a tedious, time- and labor-consuming task with potentially suboptimal outcome. In particular, the strong diversity of available methods and tools, from conventional signal processing to computational intelligence techniques, imposes severe demands on the experience and qualifications of the designer. It is our overall research goal to contribute to the design automation activities for intelligent multi-sensor systems; in this paper we focus on the optimization of feature

computation regarding two standard methods, multi-level thresholding [1] and Gaussian windowing [2], which will be optimized by the evolutionary techniques of genetic algorithms (GA) [3] and particle swarm optimization (PSO) [4] on benchmark data from gas sensor classification. In the next section, the two feature computation methods are described; they are the first two instances of a feature computation method library we are currently establishing in our work. In the third section, the employed GA and PSO techniques are explained along with the method parameter settings. In the fourth section we present experiments and results for the benchmark data. Concluding, we give an outlook on our envisioned next steps toward automated design of intelligent sensor systems.

Fig. 1. General architecture of intelligent sensor systems: sensor and scene, signal processing and feature computation, dimensionality reduction, and classification, with parameter assessment and optimization in a feedback loop.

2 Feature Computation Methods

The role of feature computation techniques is to extract the meaningful information from the raw sensor response patterns and to reduce the dimension of the feature vector of a pattern, which can increase both the speed of computation and the accuracy of pattern classification. With regard to gas sensor systems in particular, two feature computation techniques have been proposed, i.e., multi-level thresholding (MLT) [1] and Gaussian windowing [2].

When applying MLT, the first derivative of the conductance curve (the slope curve) is used as a further processing step. The MLT technique computes features in a manner similar to histogram and amplitude distribution computation. There are two ways to compute features using multi-level thresholding, i.e., the differential and cumulative modes, which count the number of signal response samples lying in the range between two thresholds. Figure 2 illustrates cumulative and differential feature computation. The features of the MLT differential mode are computed as

z_i = \sum_{s=1}^{N_r} \delta(y_s, T_p, T_q),   (1)

\delta(y_s, T_p, T_q) = \begin{cases} 1 & T_p \le y_s \le T_q \\ 0 & \text{otherwise}, \end{cases}   (2)

where y_s is the magnitude of the sensor signal with s = 1, 2, ..., N_r and N_r the total number of samples of a pattern; i indexes the features (i = 1, ..., T - 1), with T the number of thresholds used; and T_p and T_q are threshold levels with q = 2, 3, ..., T and p = q - 1. The MLT cumulative mode is computed in the same way, except that q = T and p = 1, 2, ..., T - 1.
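To make the two MLT modes concrete, the following is a minimal sketch of Eqs. (1)-(2); it is our own illustration under stated assumptions (NumPy, a 1-D slope curve, sorted threshold levels), not the authors' implementation, and the function name is hypothetical.

```python
import numpy as np

def mlt_features(slope, thresholds, mode="differential"):
    """Compute MLT features (Eqs. 1-2) from a slope curve.

    slope:      1-D array of samples y_s (first derivative of the
                conductance curve), s = 1..N_r.
    thresholds: sequence of T threshold levels.
    mode:       "differential" counts samples in [T_{q-1}, T_q], q = 2..T;
                "cumulative"   counts samples in [T_p, T_T],   p = 1..T-1.
    Returns an array of T-1 features z_i.
    """
    y = np.asarray(slope, dtype=float)
    t = np.sort(np.asarray(thresholds, dtype=float))
    T = len(t)
    if mode == "differential":
        pairs = [(t[q - 1], t[q]) for q in range(1, T)]   # (T_p, T_q), p = q-1
    else:
        pairs = [(t[p], t[-1]) for p in range(T - 1)]     # (T_p, T_T)
    return np.array([int(np.sum((lo <= y) & (y <= hi))) for lo, hi in pairs])
```

With nine thresholds in differential mode, for example, each pattern yields eight features, consistent with i = 1, ..., T - 1.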

Fig. 2. Multi-level thresholds for feature extraction from gas sensor data [1] (first derivative of the conductance curve, slope [a.u.] vs. time in temperature cycle [ms], with the cumulative and differential modes indicated).

The Gaussian windows, or kernels, extract features directly from the conductance curves or transient responses. Each kernel is a Gaussian exponential function with a given mean and standard deviation. Each kernel is multiplied by the sensor response and integrated with respect to time (see Fig. 3). The number of features equals the number of kernels. The features of Gaussian windowing are computed as

z_i = \sum_{s=1}^{N_r} y_s \cdot G(s, \mu_i, \sigma_i),   (3)

G(s, \mu_i, \sigma_i) = \exp\left(-\frac{1}{2}\left(\frac{s - \mu_i}{\sigma_i}\right)^2\right).   (4)
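Analogously, a minimal sketch of Eqs. (3)-(4) follows; again this is our own illustration (function name and sample-unit convention for the kernel parameters are assumptions), not the authors' code.

```python
import numpy as np

def gaussian_window_features(response, means, sigmas):
    """Compute Gaussian windowing features (Eqs. 3-4).

    response:      1-D array of conductance samples y_s, s = 1..N_r.
    means, sigmas: per-kernel parameters mu_i and sigma_i (sample units).
    Returns one feature per kernel: the response weighted by the kernel
    and summed (integrated) over time.
    """
    y = np.asarray(response, dtype=float)
    s = np.arange(1, len(y) + 1, dtype=float)
    feats = []
    for mu, sigma in zip(means, sigmas):
        g = np.exp(-0.5 * ((s - mu) / sigma) ** 2)   # Eq. (4)
        feats.append(np.sum(y * g))                  # Eq. (3)
    return np.array(feats)
```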
3 Optimization Methods

3.1 Genetic Algorithms

Genetic algorithms (GA) are search algorithms based on the mechanics of natural selection and natural genetics. A genetic algorithm maintains a population of chromosomes (individuals) encoding potential solutions to a given problem. The main characteristics of GA are the intensive use of randomness and genetics-inspired operations, i.e., selection, recombination (crossover), and mutation, to evolve a set of candidate solutions. We adopt the main steps of the GA applied in our previous work [1]. Briefly, the main steps of the GA, adapted to the requirements of automated sensor system design, are initialization (generating an initial population), selection for recombination (roulette wheel selection), recombination (one-point crossover), mutation, reproduction (with 10% elitism), and checking the termination conditions (loop or stop criteria).
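The listed steps translate into the usual generational loop. The sketch below is a schematic reconstruction under our own assumptions (real-valued chromosomes, a non-negative user-supplied fitness such as the overlap measure, dim >= 2); it is illustrative and not the code used in [1].

```python
import numpy as np

rng = np.random.default_rng(0)

def run_ga(fitness, dim, bounds, pop_size=20, iters=100,
           cx_rate=0.8, mut_rate=0.01, elite_frac=0.1):
    """Schematic GA: roulette wheel selection, one-point crossover,
    mutation, and 10% elitism, as outlined in Sect. 3.1."""
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, dim))           # initialization
    for _ in range(iters):
        fit = np.array([fitness(ind) for ind in pop])
        elite_n = max(1, int(elite_frac * pop_size))
        elite = pop[np.argsort(fit)[-elite_n:]]               # reproduction (elitism)
        probs = fit / fit.sum()                               # roulette wheel (fit >= 0)
        children = []
        while len(children) < pop_size - elite_n:
            i, j = rng.choice(pop_size, size=2, p=probs)      # selection
            a, b = pop[i].copy(), pop[j].copy()
            if rng.random() < cx_rate:                        # one-point crossover
                cut = int(rng.integers(1, dim))
                a[cut:], b[cut:] = b[cut:].copy(), a[cut:].copy()
            for child in (a, b):
                mask = rng.random(dim) < mut_rate             # mutation
                child[mask] = rng.uniform(lo, hi, size=int(mask.sum()))
                children.append(child)
        pop = np.vstack([elite, np.array(children)[: pop_size - elite_n]])
    fit = np.array([fitness(ind) for ind in pop])
    return pop[np.argmax(fit)]
```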
Fig. 3. Gaussian windowing for window time slicing of a conductance curve of gas sensor data (conductance [a.u.] vs. time [ms]).

3.2 Particle Swarm Optimization

Particle swarm optimization (PSO) is a non-linear method that is also affiliated with evolutionary computation techniques. Particle swarms explore the search space through a population of particles, which adapt by returning to previously successful regions [4]. The particles fly over the state space, remembering the best solution encountered. The fitness is determined by an application-specific objective function; here, we use the overlap assessment measure as the fitness function (see Sect. 3.5). During each iteration, the velocity of each particle is adjusted based on its momentum and the influence of the best solutions encountered by itself and its neighbors. The particles then move to their new positions, and the process is repeated for a prescribed number of iterations. In the original PSO implementation, the trajectory of each particle is governed by the equations

v_i(t + 1) = \omega v_i(t) + c_1 \, \mathrm{rand}() \, (p_i - x_i(t)) + c_2 \, \mathrm{rand}() \, (p_g - x_i(t))   (5)

and

x_i(t + 1) = x_i(t) + v_i(t + 1),   (6)

where x_i = (x_{i1}, x_{i2}, ..., x_{id}) and v_i are the current position vector and velocity of the i-th particle, p_i is the best position visited by the i-th particle, p_g is the position of the particle with the best fitness in the neighborhood of i, and t is the iteration number. The parameters c_1 and c_2 are called the cognitive and social learning rates. The parameter \omega is an inertia weight, which is used to dampen the velocity during the course of the simulation and allows the swarm to converge with greater precision.
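A minimal sketch of the update rules (5)-(6) is given below, under our own assumptions (a global-best neighborhood, bound clipping, and the parameter values later reported in Sect. 4); it is illustrative rather than the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_pso(fitness, dim, bounds, pop_size=20, iters=100,
            w=1.0, c1=2.0, c2=2.0):
    """Schematic global-best PSO implementing Eqs. (5)-(6)."""
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(pop_size, dim))     # positions x_i
    v = np.zeros((pop_size, dim))                     # velocities v_i
    p = x.copy()                                      # personal bests p_i
    p_fit = np.array([fitness(xi) for xi in x])
    g = p[np.argmax(p_fit)].copy()                    # neighborhood best p_g
    for _ in range(iters):
        r1 = rng.random((pop_size, dim))
        r2 = rng.random((pop_size, dim))
        v = w * v + c1 * r1 * (p - x) + c2 * r2 * (g - x)   # Eq. (5)
        x = np.clip(x + v, lo, hi)                          # Eq. (6)
        fit = np.array([fitness(xi) for xi in x])
        better = fit > p_fit
        p[better], p_fit[better] = x[better], fit[better]
        g = p[np.argmax(p_fit)].copy()
    return g
```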
3.3 Optimizing Feature Computation

In optimizing the feature computation, a particle (one member of the population) represents the array of threshold levels for the multi-level thresholding techniques, and
for the Gaussian window function, a particle represents an array of pairs of mean and standard deviation values. In searching for the optimal combination of thresholds, the MLT methods impose an ordering constraint: a threshold may not cross its lower- or higher-ranked neighbor, so the thresholds remain sorted.

3.4 Optimizing Feature Selection

The original PSO technique is designed for real-valued problems, whereas feature selection only uses binary values to represent whether a feature is selected or not. Therefore, the algorithm has been extended to tackle binary/discrete problems. Kennedy and Eberhart [5] have proposed a binary PSO (BPSO), which uses the velocity as a probability to determine whether a component of x_i will be in the one or zero state. They squash v_i using the logistic function s(v) = 1/(1 + exp(-v)), while the velocity is calculated using the same update as in Eq. (5). If a randomly generated number within [0,1] is less than s(v_{id}), then x_{id} is set to 1, otherwise it is set to 0. The minimization of the number of features is not explicitly included in Eq. (7). Instead, we have added an implicit selection condition: in the case of two or more different feature subsets with equal assessment values, the best particle is set to the feature subset of smallest cardinality.
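A short sketch of the BPSO bit update from [5], together with the subset tie-breaking described above; this is our own illustration (the helper names are hypothetical), with v taken to be the velocity from Eq. (5) computed over binary positions.

```python
import numpy as np

rng = np.random.default_rng(0)

def bpso_position_update(v):
    """Binary PSO position update [5]: squash each velocity component
    through the logistic function and sample the corresponding bit."""
    s = 1.0 / (1.0 + np.exp(-v))                  # s(v) in (0, 1)
    return (rng.random(v.shape) < s).astype(int)  # bit = 1 iff rand < s(v)

def better_subset(cur_bits, cur_fit, new_bits, new_fit):
    """Prefer higher fitness; on ties, prefer the smaller feature subset
    (the implicit selection condition of Sect. 3.4)."""
    if new_fit != cur_fit:
        return (new_bits, new_fit) if new_fit > cur_fit else (cur_bits, cur_fit)
    if new_bits.sum() < cur_bits.sum():
        return (new_bits, new_fit)
    return (cur_bits, cur_fit)
```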
3.5 Fitness Function

The process of choosing a subset of features according to certain criteria can be divided into two groups, i.e., the wrapper and the filter approach [10]. The wrapper approach takes feature subset selection and classification as a whole and selects features based directly on classification results, while the filter approach utilizes statistics underlying the training data and operates independently of the classification algorithm [11]. The process of feature computation is also based on these approaches, as shown in Fig. 4(a) and 4(b). For the assessment of feature computation and feature selection in our work, the nonparametric overlap measure [7], a filter-approach criterion for automatic feature selection, is applied. The nonparametric overlap measure q_o, which was inspired by nearest-neighbor concepts, provides a very fine-grained value range. This normalized measure gives values close to one for non-overlapping class regions and decreases towards zero proportionally to the increasing overlap of the class regions. The overlap measure is computed as

q_o = \frac{1}{L} \sum_{c=1}^{L} \frac{1}{N_c} \sum_{j=1}^{N_c} \frac{\sum_{i=1}^{k} q_{NN_{ji}} + \sum_{i=1}^{k} n_i}{2 \sum_{i=1}^{k} n_i},   (7)

with

n_i = 1 - \frac{d_{NN_{ji}}}{d_{NN_{jk}}}   (8)

and

q_{NN_{ji}} = \begin{cases} n_i & \omega_j = \omega_i \\ -n_i & \omega_j \neq \omega_i. \end{cases}   (9)
Fig. 4. The optimization models of the recognition system: (a) wrapper approach, where assessment and modification are driven by the classification result; (b) filter approach, where they are driven by the overlap measure q_o.

Here, n_i denotes the weighting factor for the position of the i-th nearest neighbor NN_{ji}, d_{NN_{ji}} denotes the Euclidean distance between z_j and NN_{ji}, d_{NN_{jk}} denotes the distance between z_j and the most distant nearest neighbor NN_{jk}, q_{NN_{ji}} denotes the measure contribution of z_j with regard to NN_{ji}, L is the number of classes, N_c is the number of patterns in class c, and ω denotes the class affiliation of z. Typically, the number of nearest neighbors well suited for the computation of this measure is 5 to 10.
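A compact sketch of Eqs. (7)-(9) follows, assuming Euclidean distances and a brute-force neighbor search; it is our own reading of the measure and not the authors' implementation.

```python
import numpy as np

def overlap_measure(X, labels, k=5):
    """Nonparametric overlap measure q_o (Eqs. 7-9).

    X:      (N, d) feature matrix, one pattern z_j per row.
    labels: (N,) class affiliations omega.
    Returns a value near 1 for well-separated classes and near 0 for
    strongly overlapping ones. Assumes k >= 2 and distinct distances,
    so that sum(n_i) > 0.
    """
    X = np.asarray(X, dtype=float)
    labels = np.asarray(labels)
    classes = np.unique(labels)
    total = 0.0
    for c in classes:
        idx_c = np.where(labels == c)[0]
        class_sum = 0.0
        for j in idx_c:
            d = np.linalg.norm(X - X[j], axis=1)
            d[j] = np.inf                          # exclude the pattern itself
            nn = np.argsort(d)[:k]                 # k nearest neighbors NN_ji
            n = 1.0 - d[nn] / d[nn[-1]]            # Eq. (8); n_k = 0
            q = np.where(labels[nn] == labels[j], n, -n)   # Eq. (9)
            class_sum += (q.sum() + n.sum()) / (2.0 * n.sum())   # Eq. (7), inner term
        total += class_sum / len(idx_c)            # average over class c
    return total / len(classes)                    # average over classes
```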
4 Experiments and Results

In our experiments, we used benchmark data of a gas sensor system [6], [1] obtained from the Laboratory for Measurement Technology, University of Saarland. The raw sensor response data was acquired with a commercial micro sensor, the MiCS 5131. These sensors possess a heater that is controlled during measurement in so-called temperature cycles. The conductance of the sensors was measured every 10 ms with 16-bit resolution during one high-temperature step (500 °C) with a duration of 800 ms, two temperature cycles with four levels (23 °C, 500 °C, 90 °C, and 290 °C) with a duration of 400 ms per level, and a heater-off phase with a duration of 4.1 seconds. Four test gases (H2, CH4, ethanol, and CO) were used, and each test gas was applied at three different concentrations. All test gases were measured at two relative humidity values (30% and 70%). The data set consists of 810 measurement values per pattern and 264 patterns, with 11 patterns for each combination of gas type, concentration, and relative humidity. The data samples of the four gases were merged into four classes, neglecting the underlying differences in concentration and relative humidity. The available data set was separated into training and testing sets with 144 and 120 patterns, respectively; this holdout method was used for classification. We performed feature computation and feature selection only on the training set; the testing set was not used for feature computation and feature selection in order to ensure that the classification results are not biased. To extend our experiments, the
leave-one-out (LOO) cross-validation approach was additionally applied to the testing set for classification estimation.

The control parameters of the GA were set as in our prior work [1]: the population size is 20 individuals, the number of iterations is 100, the recombination rate is 0.8, the mutation rate is 0.01, and elitism (reproduction) covers 10% of the population size. The parameter settings of PSO in our experiments were determined as follows: each experiment is repeated for 10 runs and each run has 100 iterations; the population size is 20; ω, c_1, and c_2 are 1, 2, and 2, respectively. The number of nearest neighbors used is 5, both for the kNN voting classifier [9] and for the overlap measure. In the experiments, the number of window kernels ranges from 3 to 10 per temperature step (10 steps).

In our prior work [1], genetic algorithms were used to optimize the multi-level thresholding (MLT) feature computation. In the first step of our experiments, we compared the GA and PSO techniques for optimizing the MLT feature computation in both the differential and cumulative modes. The number of thresholds used is nine for the differential mode and five for the cumulative mode, following the best results of the GA in our prior work [1]. Figures 5(a) and 5(b) show that PSO performed better than the GA with respect to the mean and standard deviation of the overlap assessment values. Moreover, for the classification accuracies on both the training and test data sets, PSO achieved superior results compared to GA, as shown in Table 1 and Table 2. This agrees with comparisons of PSO and GA reported in the literature (e.g., [8] and [13]).

Fig. 5. Comparison of the overlap assessment curves (fitness/overlap value vs. iteration) between GA and PSO: (a) MLT differential mode; (b) MLT cumulative mode.

In the next experiment, we used only PSO for optimizing the Gaussian windowing feature computation, due to its better performance compared to GA. As shown in Table 3, varying the number of kernels does not yield significantly different results with regard to the mean and standard deviation of the overlap measure values and the classification accuracies.

The results achieved by multi-level thresholding using PSO showed slightly better classification accuracies than Gaussian windowing, at a lower computational effort. The MLT techniques use a summation operator, and their number of features depends only on the number of thresholds used, whereas Gaussian windowing relies on a multiplication operator, and its number of features depends on the number of kernels used as well as on the number of temperature steps.

In the next step, after applying Gaussian windowing feature computation, the automated feature selection method was carried out to improve the overlap measure values and the classification accuracy, as shown in Table 4. Because the optimization of feature computation and feature selection applies the filter approach, which uses the overlap measure as the criterion function instead of the classification directly, the recognition rates on the testing set can be higher than on the training set, as seen in Tables 3 and 4.

Table 1. Comparison of MLT differential mode results between GA and PSO (mean/std).

Method | overlap q_o    | train (%)  | test (%)   | test-LOO (%)
GA     | 0.9950/0.0035  | 99.44/0.55 | 99.67/0.58 | 99.17/0.79
PSO    | 1.00/0         | 100/0      | 100/0      | 99.83/0.35

Table 2. Comparison of MLT cumulative mode results between GA and PSO (mean/std).

Method | overlap q_o    | train (%)  | test (%)   | test-LOO (%)
GA     | 0.9878/0.0044  | 98.89/0.36 | 99.50/6.36 | 98.67/1.48
PSO    | 0.9953/0.0024  | 99.10/0.34 | 99.92/0.89 | 99.83/0.35

Table 3. Results of the Gaussian window function using PSO (mean/std).

kernels (x10) | overlap q_o    | train (%)  | test (%)   | test-LOO (%)
3             | 0.9806/0.0044  | 97.91/0.65 | 99.00/0.66 | 95.50/2.29
4             | 0.9791/0.0081  | 97.78/0.79 | 99.00/0.77 | 95.83/1.76
5             | 0.9794/0.0021  | 98.13/0.03 | 99.67/0.43 | 96.08/1.11
6             | 0.9797/0.0034  | 97.71/0.74 | 98.75/0.90 | 94.92/2.17
7             | 0.9795/0.0015  | 98.13/0.57 | 99.25/0.73 | 96.92/1.11
8             | 0.9786/0.0027  | 97.92/0.46 | 99.00/0.53 | 95.67/0.95
9             | 0.9786/0.0031  | 97.92/0.46 | 99.08/0.61 | 95.83/1.36
10            | 0.9787/0.0016  | 98.13/0.47 | 99.75/0.40 | 96.08/0.88

Table 4. Results of the Gaussian window function after applying feature selection.

kernels (x10) | overlap q_o | selected features | train (%) | test (%) | test-LOO (%)
3             | 0.9822      | 10                | 99.31     | 100      | 99.17
4             | 0.9854      | 10                | 98.61     | 100      | 98.33
5             | 0.9835      | 36                | 99.31     | 100      | 98.33
6             | 0.9844      | 26                | 99.31     | 100      | 97.50
7             | 0.9805      | 29                | 99.31     | 100      | 98.33
8             | 0.9889      | 34                | 99.31     | 100      | 99.17
9             | 0.9841      | 41                | 99.31     | 99.17    | 98.33
10            | 0.9859      | 50                | 99.31     | 100      | 98.33

5 Conclusion

In this paper, we contribute to the optimization of the feature computation step in the overall automated design of intelligent sensor systems, investigating multi-level thresholding and Gaussian windowing, both optimized by the evolutionary techniques of genetic algorithms and particle swarm optimization. The experimental results show that PSO performed better than, and in part clearly superior to, GA. With respect to the overlap measure of the filter approach and the classification accuracy, the multi-level thresholding techniques achieved better results at a lower computational effort than Gaussian windowing. The subsequent processing step of dimensionality reduction (feature selection) demonstrably improved the recognition system.

In future work, we will consider applications in sensor networks and develop a library/toolbox of feature computation techniques for multi-sensor system design,

intended to increase the choice of feature processing methods and to apply feature-level fusion in order to advance the recognition accuracy of sensor systems. Furthermore, additional feature assessment functions according to the wrapper and filter approaches [10], [11], as well as their combination in multi-objective optimization problems [12], [13], will be considered for assessing feature computation and feature selection in the automated design of intelligent sensor systems.

Acknowledgment

The provision of the gas sensor benchmark data [1], [6] by Thomas Fricke, Marc Baumbach, and Andreas Schuetze from the Laboratory for Measurement Technology, University of Saarland, is gratefully acknowledged.

References

1. Iswandy, K., Koenig, A., Fricke, T., Baumbach, M., Schuetze, A.: Towards Automated Configuration of Multi-Sensor Systems Using Evolutionary Computation - A Method and a Case Study. J. Computational and Theoretical Nanoscience, Vol. 2, No. 4. American Scientific Publishers (2005) 574-582.
2. Courte, D. E., Rizki, M. M., Tamburino, L. A., Gutierrez-Osuna, R.: Evolutionary Optimization of Gaussian Windowing Functions for Data Preprocessing. Int. J. Artificial Intelligence Tools, Vol. 12, No. 1. World Scientific (2003) 17-35.
3. Goldberg, D. E.: Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley, Reading, MA (1989).
4. Kennedy, J., Eberhart, R. C.: Particle Swarm Optimization. Proc. of IEEE Int. Conf. on Neural Networks (ICNN), Vol. 4 (1995) 1942-1948.
5. Kennedy, J., Eberhart, R. C.: A Discrete Binary Version of the Particle Swarm Algorithm. Proc. of IEEE Int. Conf. on Systems, Man, and Cybernetics (1997) 4104-4109.
6. Baumbach, M., Sossong, A., Delprat, H., Soulantica, K., Schuetze, A., Borrel, H., Maisonnat, A., Chaudret, B.: New Micro Machined Gas Sensors Combined with Intelligent Signal Processing Allowing Fast Gas Identification after Power-Up. Proceedings Sensor 2005, Vol. 2, 91-96.

7. Koenig, A., Gratz, A.: Advanced Methods for the Analysis of Semiconductor Manufacturing Process Data. In: Pal, N. R., Jain, L. C. (eds.): Advanced Techniques in Knowledge Discovery and Data Mining. Springer Verlag (2005) 27-74.
8. Eberhart, R. C., Shi, Y.: Comparison between Genetic Algorithms and Particle Swarm Optimization. In: Porto, V. W., Saravanan, N., Waagen, D., Eiben, A. E. (eds.): Evolutionary Programming VII: Proc. 7th Ann. Conf., San Diego, CA. Springer Verlag, Berlin (1998).
9. Raymer, M. L., Punch, W. F., Goodman, E. D., Kuhn, L. A., Jain, A. K.: Dimensionality Reduction Using Genetic Algorithms. IEEE Trans. Evolutionary Computation, Vol. 4, No. 2 (2000) 164-171.
10. Liu, H., Motoda, H.: Feature Selection for Knowledge Discovery and Data Mining. Kluwer Academic Publishers (1998).
11. Mao, K. Z.: Fast Orthogonal Forward Selection Algorithm for Feature Subset Selection. IEEE Trans. Neural Networks (2002) 1218-1224.
12. Emmanouilidis, C., Hunter, A., MacIntyre, J.: A Multiobjective Evolutionary Setting for Feature Selection and a Commonality-Based Crossover Operator. In: 2000 Congress on Evolutionary Computation (CEC 2000). IEEE Service Center (2000).
13. Iswandy, K., Koenig, A.: Feature Selection with Acquisition Cost for Optimizing Sensor System Design. Accepted at Kleinheubacher Tagung, KH2005, C.1, Integrierte digitale und analoge Schaltungen. Miltenberg, Germany (2005).