An Experimental Multi-Objective Study of the SVM Model Selection Problem
Giuseppe Narzisi
Courant Institute of Mathematical Sciences, New York, NY 10012, USA

Abstract. Support Vector Machines (SVMs) are a powerful method for both regression and classification. However, any SVM formulation requires the user to set two or more parameters which govern the training process, and such parameters can have a strong effect on the resulting performance of the engine. Moreover, the design of learning systems is inherently a multi-objective optimization problem: it requires finding a suitable trade-off between at least two conflicting objectives, model complexity and accuracy. In this work the SVM model selection problem is cast as a multi-objective optimization problem, where the cross-validation error and the number of support vectors of the model define the two objectives. Experimental analysis is presented on a well-known test-bed of datasets using two different kernels: RBF and sigmoid.

Key words: Support Vector Machine, Multi-Objective Optimization, NSGA-II, SVM Model Selection.

1 Introduction

Support Vector Machines have been proven to be very effective methods for classification and regression [12]. However, in order to obtain good generalization performance the user needs to choose appropriate values for the involved parameters of the model. The kernel parameters, together with the regularization parameter C, are called the hyperparameters of the SVM, and the problem of tuning them in order, for example, to improve the generalization of the model is called the SVM model selection problem.

Usually the standard method to determine the hyperparameters is grid search. In the simple grid-search approach, the hyperparameters are varied with a fixed step-size through a wide range of values and the performance of every combination is measured (see the sketch below). Because of its computational complexity, grid search is only suitable for the adjustment of very few parameters. Further, the choice of the discretization of the search space may be crucial. Figure 1 shows the typical parameter surface for the error and the number of support vectors as a function of the hyperparameters C and γ for the diabetes dataset.
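The paper runs its SVMs through LIBSVM; purely for illustration, here is a minimal sketch of the naive grid-search baseline, assuming scikit-learn (whose `SVC` wraps LIBSVM). The grid ranges follow the ones used later in Section 5.1; the function name and step size are illustrative choices, not taken from the paper.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def grid_search_svm(X, y, folds=5):
    """Naive grid search: evaluate an RBF SVM on a fixed log2 grid of
    (C, gamma) and return the pair with the lowest mean CV error."""
    best_params, best_err = None, np.inf
    for log2C in range(-5, 16):        # C = 2^-5 ... 2^15, fixed step size 1
        for log2g in range(-10, 5):    # gamma = 2^-10 ... 2^4
            C, gamma = 2.0 ** log2C, 2.0 ** log2g
            acc = cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=folds).mean()
            if 1.0 - acc < best_err:
                best_err, best_params = 1.0 - acc, (C, gamma)
    return best_params, best_err
```

Even this coarse grid trains 21 × 15 × 5 = 1,575 SVMs, and the cost grows exponentially with each additional hyperparameter, which is exactly the scalability problem noted above.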
Recently, gradient-based approaches have been explored for choosing the hyperparameters [2, 6, 8]. However, they have some drawbacks and limitations. First of all, the score function used to evaluate the quality of a set of hyperparameters must be differentiable, which excludes important measures such as the number of support vectors. Also, because the objective function is strongly multimodal, the performance of the gradient-based heuristics depends on the initialization, which means that the algorithm can easily get stuck in a sub-optimal local minimum.

Fig. 1. Parameter surface of the error (a) and the number of SVs (b) as a function of the two hyperparameters C and γ for the diabetes dataset using 5-fold cross-validation.

The main idea which is missing in this kind of approach is that the SVM model selection problem is inherently a multi-objective optimization problem. Designing supervised learning systems for classification requires finding a suitable trade-off between several objectives. Typically we want to reduce the complexity of the model while at the same time obtaining a model with a high accuracy level (or low error rate). Sometimes the model with the best generalization may not be the best choice if the price we have to pay is working with a very complex model, both in terms of time and space. Usually this problem is tackled by aggregating the objectives into a scalar function (a linear weighting of the objectives) and applying standard methods to the resulting single-objective optimization problem. However, it has been shown that this approach is not a good solution, because it requires that the aggregate function correctly matches the problem, and this is not an easy task. The better solution is to apply the multi-objective approach directly, in order to find the Pareto-optimal set of solutions for the problem.

Among the many possible approaches to solving a multi-objective optimization problem, the last decade has seen Multi-Objective Evolutionary Algorithms (MOEAs) emerge as the leading method in this area. Successful applications have already been obtained in the machine learning area in the case of feature selection for SVMs [9, 10]. Experiments similar to the ones presented in this paper have been proposed in [7], where the split modified radius-margin bounds and the training error were used in conjunction with the number of SVs. The experiments presented in this work differ from that approach in several ways:
1) the impact of different kernels is analyzed; 2) the simple, straightforward 2-objective formulation (number of SVs and CV error) is considered before any additional sophistication; 3) the standard NSGA-II algorithm is used instead of the NSES algorithm proposed in [7]; 4) the error is evaluated using the 5-fold cross-validation method.

There are many reasons for using a multi-objective evolutionary approach for SVM model selection:
- the ability to obtain in one run not just a single model, but several models which are optimal (in the Pareto sense) with respect to the selected objectives or criteria;
- the best SVM model can be selected later from the Pareto front according to some higher-level information or preferences;
- multiple hyperparameters can be tuned at the same time, overcoming the limitation of the naive grid-search method;
- the objectives/criteria do not need to be differentiable (as required by gradient-based methods);
- efficient exploration of the multimodal search space associated with the parameters.

The goal of this research work is to show the effectiveness of this approach for SVM model selection using a very simple 2-objective formulation which takes into account the complexity and the accuracy of the model.

The paper is organized as follows. We first introduce SVMs and SVM model selection from the perspective of multi-objective optimization. Then we give the background on multi-objective optimization and introduce the class of multi-objective evolutionary algorithms. Section 5 reports the results obtained on a test bed of four datasets widely used in the literature. Finally, conclusions are presented and possible future lines of investigation are given.

2 Multi-objective view of SVM

The first evidence of the multi-objective nature of SVMs is directly related to their standard formulation in the non-separable case, the so-called C-SVM formulation:

$$\min \;\; \frac{1}{2}\|w\|^2 + C \sum_{i=1}^{m} \xi_i \qquad (1)$$

subject to $y_i[w \cdot x_i + b] \ge 1 - \xi_i$, $\xi_i \ge 0$, $i \in [1, m]$,

where C is the regularization parameter which determines the trade-off between the margin and the sum of the slack variables $\sum_{i=1}^{m} \xi_i$. The constant C is usually determined using some heuristic approach. However, the more natural formulation of the problem is the following:

$$\min \;\; \frac{1}{2}\|w\|^2, \qquad \min \;\; \sum_{i=1}^{m} \xi_i \qquad (2)$$

subject to $y_i[w \cdot x_i + b] \ge 1 - \xi_i$, $\xi_i \ge 0$, $i \in [1, m]$,

where the objective in (1) is split into two different conflicting objectives, overcoming the problem of determining the parameter C.
Even if this formulation is more natural than (1), not much work on this problem is present in the literature. It would be interesting to analyze this problem using the theoretical approach presented by Mihalis Yannakakis in [13], where he discusses the conditions under which an approximate trade-off curve can be constructed efficiently (in polynomial time).

The multi-objective nature of SVM training is also present at the level of model selection. The typical criterion of evaluation for a classifier is the accuracy of the model in classifying newly generated points, and this metric is often used alone in order to select/generate good classifiers. However, there are many other important factors that must be taken into account when selecting an SVM model. A possible (not exhaustive) list is the following: the number of input features; a bound on the generalization error (e.g., the radius-margin bound); the number of support vectors. In this paper we consider the last one, the number of SVs, as an additional selection criterion.

3 Multi-Objective Optimization

When an optimization problem involves more than a single objective function, the task of finding one (or more) optimum solution(s) is known as the Multi-Objective Optimization Problem (MOOP) [4]. An optimum solution with respect to one objective may not be optimum with respect to another. As a consequence, one cannot choose a solution which is optimal with respect to only one objective. In problems characterized by more than one conflicting objective there is no single optimum solution; instead there exists a set of solutions which are all optimal, called the Pareto-optimal front. A general multi-objective optimization problem is defined as follows (minimization case):

$$\min \;\; F(x) = [f_1(x), f_2(x), \ldots, f_M(x)]$$
$$\text{subject to} \;\; E(x) = [e_1(x), e_2(x), \ldots, e_L(x)] \le 0, \qquad x_i^{(L)} \le x_i \le x_i^{(U)}, \;\; i = 1, \ldots, N \qquad (3)$$

where $x = (x_1, x_2, \ldots, x_N)$ is the vector of the N decision variables, M is the number of objectives $f_i$, L is the number of constraints $e_j$, and $x_i^{(L)}$ and $x_i^{(U)}$ are respectively the lower and upper bounds for each decision variable $x_i$.

Two different solutions are compared using the concept of dominance, which induces a strict partial order in the objective space F. Here a solution a is said to dominate a solution b if it is better or equal in all objectives and strictly better in at least one. For the minimization case we have:

$$F(a) \preceq F(b) \;\; \text{iff} \;\; \begin{cases} f_i(a) \le f_i(b) & \forall\, i \in \{1, \ldots, M\} \\ \exists\, j \in \{1, \ldots, M\} : f_j(a) < f_j(b) \end{cases} \qquad (4)$$
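The dominance test of Eq. (4) is mechanical enough to state in a few lines of code. The following helper (an illustrative sketch, not part of the paper) is reused in the NSGA-II sketch later on:

```python
def dominates(fa, fb):
    """Eq. (4), minimization: fa dominates fb iff fa is no worse in every
    objective and strictly better in at least one."""
    return all(a <= b for a, b in zip(fa, fb)) and any(a < b for a, b in zip(fa, fb))

# Example with (CV error, number of SVs) objective vectors:
assert dominates((0.21, 300.0), (0.23, 400.0))      # better in both objectives
assert not dominates((0.21, 500.0), (0.23, 400.0))  # incomparable: a trade-off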
In the specific case of SVM model selection, the hyperparameters are the decision variables of the problem, the range of exploration for each parameter gives the bounds of each decision variable, and the model selection criteria are the objectives (no constraints are used in this formulation).

4 Method

4.1 Model selection metrics

As discussed in Section 2, there are many criteria that can be used for SVM model selection. In this section we introduce the two objectives that have been used for the simulations.

Accuracy. The most direct way to evaluate the quality of an SVM model is to consider its classification performance (accuracy). In the simplest case the data is split into a training and a validation set: the first set is used to generate the SVM model, the second to evaluate the performance of the classifier. In this work we use the more general approach called L-fold cross-validation (CV). The data is partitioned into L disjoint sets D_1, D_2, ..., D_L, and the SVM is trained L times, each time on all the data except the set D_i, which is then used as validation data. The accuracy (or error) is computed as the mean of the L different experiments. For reasons of computational complexity we use a 5-fold CV for each dataset.

Number of support vectors. We know that in the hard-margin case the number of SVs is an upper bound on the expected number of errors made by the leave-one-out procedure. Moreover, the space and time complexity of the SVM classifier scales with the number of SVs. It follows that it is important to have an SVM model with a small number of support vectors (SVs). Similarly to the 5-fold CV error, the number of SVs is computed as the mean over the 5 different experiments of the CV method (see the sketch below).
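As a sketch of how the two criteria could be computed for a single candidate model, the routine below assumes scikit-learn's `SVC` (a LIBSVM wrapper); the paper calls LIBSVM directly, and the use of stratified folds is an assumption of this example.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.svm import SVC

def svm_objectives(X, y, C, gamma, folds=5):
    """Objectives for one RBF model: (mean 5-fold CV error, mean number of SVs).
    X and y are NumPy arrays."""
    errors, n_svs = [], []
    for train_idx, val_idx in StratifiedKFold(n_splits=folds).split(X, y):
        clf = SVC(C=C, gamma=gamma).fit(X[train_idx], y[train_idx])
        errors.append(1.0 - clf.score(X[val_idx], y[val_idx]))  # fold error
        n_svs.append(len(clf.support_))                          # fold SV count
    return float(np.mean(errors)), float(np.mean(n_svs))
```

This returns exactly the two-dimensional objective vector that the dominance test of Eq. (4) compares.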
4.2 Multi-Objective Evolutionary Algorithms

Evolutionary algorithms (EAs) are search methods that take their inspiration from natural selection and survival of the fittest in the biological world. EAs differ from more traditional optimization techniques in that they search from a population of solutions, not from a single point. Each iteration of an EA involves a competitive selection that weeds out poor solutions. The solutions with high fitness are recombined with other solutions by swapping parts of one solution with another; solutions are also mutated by making a small change to a single element of the solution. Recombination and mutation are used to generate new solutions that are biased towards regions of the space in which good solutions have already been seen. Multi-Objective Evolutionary Algorithms (MOEAs) are a special class of EAs with the goal of solving problems involving many conflicting objectives [4]. Over the last decade a steady stream of MOEAs has continued to be proposed and studied [4, 3], and MOEAs have been successfully applied to several real-world problems (protein folding, circuit design, safety-related systems, etc.), even if no strong proof of convergence is available.

Among the growing class of MOEAs, in this work we employ the well-known NSGA-II [5] (Nondominated Sorting Genetic Algorithm II). NSGA-II is based on a fast nondominated sorting approach to sort a population of solutions into different nondomination levels; it then uses elitism and a crowded-comparison operator for diversity preservation.
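To make the sorting step concrete, here is a compact sketch of fast nondominated sorting, reusing the `dominates` helper defined after Eq. (4). NSGA-II additionally applies crowding-distance computation and elitist environmental selection, which are omitted here.

```python
def fast_nondominated_sort(objs):
    """Sort a population (list of objective tuples, minimization) into
    nondomination fronts. fronts[0] holds the indices of the current
    Pareto-optimal set, fronts[1] the next level, and so on."""
    n = len(objs)
    dominated_by = [[] for _ in range(n)]  # solutions that i dominates
    dom_count = [0] * n                    # how many solutions dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(objs[i], objs[j]):
                dominated_by[i].append(j)
            elif dominates(objs[j], objs[i]):
                dom_count[i] += 1
        if dom_count[i] == 0:
            fronts[0].append(i)            # nondominated: first front
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:      # only dominated by earlier fronts
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    fronts.pop()                           # drop the trailing empty front
    return fronts
```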
Table 1. Benchmark datasets.

| Name       | Size  | Features | Repository |
|------------|-------|----------|------------|
| diabetes   | 768   | 8        | UCI        |
| australian | 690   | 14       | Statlog    |
| german     | 1,000 | 24       | Statlog    |
| splice     | 1,000 | 60       | Delve      |

5 Results

5.1 Experiments

In this research work we deal with the standard application of SVMs for binary classification. We use a common benchmark of four datasets (Table 1 shows their characteristics). We consider two different kernels and their parameters:

- RBF (radial basis function): $K(u, v) = \exp(-\gamma \|u - v\|^2)$
- Sigmoid: $K(u, v) = \tanh(\gamma u^T v + coef_0)$

It follows that the hyperparameters considered will be, respectively, $(C, \gamma)$ for the RBF kernel and $(C, \gamma, coef_0)$ for the sigmoid kernel. The parameter ranges are: $\log_2 C \in [-5, 15]$, $\log_2 \gamma \in [-10, 4]$, $coef_0 \in [0, 1]$.

According to the values suggested in [5], the NSGA-II parameters are set as follows: $p_c = 0.9$, $p_m = 0.1$, $\nu_c = 10$, $\nu_m = 20$. No effort has been spent in this work to tune these parameters, which clearly could improve the efficiency of the algorithm. A population size of 60 individuals is used and each simulation is carried out for a total of 250 generations. Each plot shows the Pareto fronts (trade-off curves) of all the points (SVM models) sampled by the algorithm after the first 50 generations. As described later, 50 iterations are enough to converge to the final approximated Pareto front. SVMs are constructed using the LIBSVM library [1] (http://www.csie.ntu.edu.tw/~cjlin/libsvm). Figure 2 shows the interaction between NSGA-II and the LIBSVM library.

Fig. 2. NSGA-II and LIBSVM pipeline: NSGA-II evolves the hyperparameters, LIBSVM returns the error and mean number of SVs on 5-fold cross-validation, and the output Pareto fronts (trade-off curves) feed the decision-making phase.

Fig. 3. Diabetes dataset: Pareto front of the sampled points using RBF (a) and sigmoid (b) kernels; mean evolution of the population for the error and the number of SVs during the optimization of NSGA-II using the RBF (c) and sigmoid (d) kernels.
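The paper does not specify how the real-valued genome is encoded, so purely as an assumption, one possible mapping from a normalized decision vector to the hyperparameter ranges above is:

```python
def decode(x):
    """Map x in [0, 1]^3 to (C, gamma, coef0) using the search ranges of
    Section 5.1: log2 C in [-5, 15], log2 gamma in [-10, 4], coef0 in [0, 1].
    (The exact genome encoding is an assumption, not taken from the paper.)"""
    C = 2.0 ** (-5.0 + 20.0 * x[0])
    gamma = 2.0 ** (-10.0 + 14.0 * x[1])
    coef0 = x[2]  # only used by the sigmoid kernel
    return C, gamma, coef0
```

Each NSGA-II evaluation would then be `decode` followed by `svm_objectives`, yielding the (error, number of SVs) pair that drives the nondominated sorting.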
5.2 Discussion

Figures 3, 4, 5 and 6 show the results obtained using the experimental protocol defined above. Inspecting the results we observe, first of all, that approximate Pareto fronts are effectively obtained for each of the datasets, showing that the two chosen objectives exhibit a conflicting behavior. This is also evident from the analysis of the evolution curves: an improvement in one objective is nearly always accompanied by a worsening in the other, but their interaction during the evolution produces a global minimization of both objectives.

The choice of the kernel clearly affects the final outcome of the optimization algorithm. In particular, the RBF kernel shows a better performance than the sigmoid kernel. Inspecting the Pareto fronts obtained, we note that the RBF kernel yields a better distribution of solutions along the two objectives. This is an important factor in multi-objective optimization: we want Pareto fronts with a wide range of values, so that the selection of a final point in the second step (decision making) is facilitated.

Fig. 4. Australian dataset: Pareto front of the sampled points using RBF (a) and sigmoid (b) kernels; mean evolution of the population for the error and the number of SVs during the optimization of NSGA-II using the RBF (c) and sigmoid (d) kernels.
For each dataset we also plot the mean evolution curves of the error and the number of support vectors over the population of SVM models at each iteration. Inspecting the plots, we observe that the algorithm generally converges very quickly to a set of good SVM models (within the first 50 iterations); it then uses the rest of the time to explore the space of solutions locally for an additional, finer refinement.

If we compare the accuracy of the SVM models obtained using this method with other approaches in the literature, we find comparable results. For example, the best error obtained for the diabetes dataset with this approach is 21.7, while the errors obtained by Keerthi in [8], Chapelle in [2] and Staelin in [11] are, respectively, 24.33, … and …. Similarly, for the splice dataset we obtain an error of 12.4, while the errors obtained by Keerthi in [8] and Staelin in [11] are … and …, respectively.

Fig. 5. Splice dataset: Pareto front of the sampled points using RBF (a) and sigmoid (b) kernels; mean evolution of the population for the error and the number of SVs during the optimization of NSGA-II using the RBF (c) and sigmoid (d) kernels.
An important advantage of this approach is that, together with good models in terms of accuracy, the algorithm also generates many other models with different numbers of support vectors, which is relevant whenever the complexity of the final model is an important factor for the final model selection. For example, in the case of the splice dataset, we might be happy to lose some degree of accuracy, selecting a solution with an error of 14% instead of 12%, in favor of a model that has a much lower complexity: 370 SVs instead of 570 (see Figure 5).

Fig. 6. German dataset: Pareto front of the sampled points using RBF (a) and sigmoid (b) kernels; mean evolution of the population for the error and the number of SVs during the optimization of NSGA-II using the RBF (c) and sigmoid (d) kernels.
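The decision-making step behind the splice example can be as simple as filtering the front by an error budget. A hypothetical helper (the name, tuple layout and selection rule are illustrative assumptions, not the paper's procedure):

```python
def select_model(front, max_error):
    """Pick the least complex Pareto-front model within the error budget.
    `front` is a list of (error, n_svs, hyperparams) tuples; returns None
    if no model satisfies the constraint."""
    feasible = [m for m in front if m[0] <= max_error]
    return min(feasible, key=lambda m: m[1], default=None)
```

With the splice front of Figure 5, `select_model(front, 0.14)` would return the roughly 370-SV model rather than the more accurate but heavier 570-SV one.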
6 Conclusions and possible future investigations

The SVM model selection problem clearly presents the characteristics of a multi-objective optimization problem. The results of this experimental work have shown that it is possible to effectively obtain approximated Pareto fronts of SVM models based on a simple 2-objective formulation in which the accuracy and the complexity of the model are compared for Pareto dominance. This approach allows the user to visualize the characteristic trade-off curve for a specific dataset, from which a specific model can be selected according to the user's own preferences and computational needs.

The proposed method also obtains results comparable to other approaches in the literature, but with the advantage that a set of Pareto-optimal solutions (not a single one) is generated as output. Of course a deeper investigation is required, and many different lines of investigation can be considered:

- extending the formulation from 2 objectives to k objectives (k > 2), including other important model selection criteria (for example, the number of input features);
- studying the performance of the proposed approach in the regression case;
- adapting the approach to the multi-class case, where it is harder to choose appropriate values for the base binary models of a decomposition scheme.

References

1. Chih-Chung Chang and Chih-Jen Lin. LIBSVM: a library for support vector machines, 2001. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm.
2. Olivier Chapelle, Vladimir Vapnik, Olivier Bousquet, and Sayan Mukherjee. Choosing multiple parameters for support vector machines. Machine Learning, 46(1-3):131–159, 2002.
3. C. A. Coello Coello and G. B. Lamont. Applications of Multi-Objective Evolutionary Algorithms. World Scientific, 2004.
4. Kalyanmoy Deb. Multi-Objective Optimization Using Evolutionary Algorithms. John Wiley & Sons, Inc., New York, NY, USA, 2001.
5. Kalyanmoy Deb, Samir Agrawal, Amrit Pratap, and T. Meyarivan. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evolutionary Computation, 6(2):182–197, 2002.
6. Tobias Glasmachers and Christian Igel. Gradient-based adaptation of general Gaussian kernels. Neural Computation, 17(10):2099–2105, 2005.
7. Christian Igel. Multi-objective model selection for support vector machines. Evolutionary Multi-Criterion Optimization, pages 534–546, 2005.
8. S. S. Keerthi. Efficient tuning of SVM hyperparameters using radius/margin bound and iterative algorithms. IEEE Transactions on Neural Networks, 13(5):1225–1229, 2002.
9. S. Pang and N. Kasabov. Inductive vs. transductive inference, global vs. local models: SVM, TSVM, and SVMT for gene expression classification problems. International Joint Conference on Neural Networks (IJCNN), vol. 2, 2004.
10. S. Y. M. Shi, P. N. Suganthan, and K. Deb. Multi-class protein fold recognition using multi-objective evolutionary algorithms. IEEE Symposium on Computational Intelligence in Bioinformatics and Computational Biology, pages 61–66.
11. Carl Staelin. Parameter selection for support vector machines. HP Labs Technical Reports, 2002.
12. Vladimir N. Vapnik. The Nature of Statistical Learning Theory. Springer-Verlag New York, Inc., New York, NY, USA, 1995.
13. Mihalis Yannakakis. Approximation of multiobjective optimization problems. Algorithms and Data Structures: 7th International Workshop, page 1, 2001.