A Classifier with the Function-based Decision Tree

Been-Chian Chien and Jung-Yi Lin
Institute of Information Engineering, I-Shou University, Kaohsiung 84008, Taiwan, R.O.C.

Abstract

Classification is one of the important problems in the research areas of knowledge discovery and machine learning. In this paper, an accurate multi-category classifier based on genetic programming is proposed. The classifier consists of discriminant functions generated by genetic programming. We propose the function-based decision tree (FDT) to resolve the ambiguity between discriminant functions, and the experiments show that the proposed method is accurate.

(Keywords: Knowledge discovery, Machine learning, Genetic programming, Classification, Decision Tree)

1. Introduction

Classification plays an important role in knowledge discovery and machine learning, and many applications, such as pattern recognition, disease diagnosis, and business decision-making, can be viewed as extensions of the classification problem. To reduce training time and increase the classification rate, many methods have been proposed over the past decades for building efficient and effective classifiers, for example Bayesian decision theory [5], neural networks [17], k-nearest neighbors [2], genetic algorithms (GA) [15][16], and genetic programming (GP) [4][6].

In genetic programming, the task of classification is accomplished by a set of discriminant functions [6] or by classification rules. Classifying by discriminant functions has advantages: the functions are concise and efficient, and few of them are needed. However, ambiguity may occur when an object is recognized by two or more discriminant functions at the same time, or when an object is recognized by none of the functions, so a complete classification approach should provide a mechanism to resolve these problems. In this paper, we propose such a mechanism: the function-based decision tree (FDT). Two well-known datasets are selected to show the performance of the proposed classifier: Fisher's Iris dataset and the Wisconsin Breast Cancer dataset.

The remainder of this paper is organized as follows. Section 2 briefly reviews genetic programming. In Section 3, we propose the distance-based fitness function of GP. In Section 4, we propose the FDT. The experimental results are described in Section 5. Finally, conclusions are made in Section 6.

2. GENETIC PROGRAMMING

The technique of genetic programming (GP) was proposed by Koza [7][8]. Genetic programming can discover underlying relationships in data and present them as expressions. An expression is constructed from a set of possible terminals and functions; several types of functions can be applied, for example arithmetic operations and trigonometric functions. Genetic programming begins with a set called the population, which consists of randomly created individuals. Each individual is an expression that stands for a potential solution, and a fitness function is given for evaluating the fitness of each individual. During a generation, individuals are modified by the genetic operators, reproduction, crossover, and mutation [1], to obtain better fitness; at the end of a generation, a new population containing individuals with better fitness replaces the original one. After the evolution of a number of generations, the individual with the best fitness value is taken as the solution.
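To make this cycle concrete, the following is a minimal, generic sketch of a GP loop in Python. It is illustrative only: individuals are assumed to be binary expression trees over a small arithmetic function set, the terminal names x0 and x1 are placeholder attribute names, division is omitted to avoid divide-by-zero handling, and none of this reflects the internals of the GP Quick 2.1 system actually used in the experiments of Section 5.

```python
import random
import operator

# Function set (arithmetic operators) and terminal set (attribute names).
# 'x0' and 'x1' are placeholder names used only for this illustration.
FUNCTIONS = {'+': operator.add, '-': operator.sub, '*': operator.mul}
TERMINALS = ['x0', 'x1']

def random_tree(depth=3):
    """Grow a random expression tree as nested lists: [op, left, right]."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS + [round(random.uniform(-10, 10), 1)])
    op = random.choice(list(FUNCTIONS))
    return [op, random_tree(depth - 1), random_tree(depth - 1)]

def evaluate(tree, sample):
    """Evaluate a tree on one sample, given as a dict of attribute values."""
    if isinstance(tree, list):
        op, left, right = tree
        return FUNCTIONS[op](evaluate(left, sample), evaluate(right, sample))
    return sample[tree] if isinstance(tree, str) else tree

def random_subtree(tree):
    """Return a randomly chosen subtree (biased toward shallow nodes)."""
    if isinstance(tree, list) and random.random() < 0.5:
        return random_subtree(tree[random.randint(1, 2)])
    return tree

def crossover(a, b):
    """Return a copy of `a` in which one subtree is replaced by a subtree of `b`."""
    if not isinstance(a, list) or random.random() < 0.3:
        return random_subtree(b)
    child = list(a)
    i = random.randint(1, 2)
    child[i] = crossover(a[i], b)
    return child

def mutate(tree, depth=2):
    """Replace a randomly chosen subtree by a newly grown random tree."""
    if not isinstance(tree, list) or random.random() < 0.3:
        return random_tree(depth)
    child = list(tree)
    i = random.randint(1, 2)
    child[i] = mutate(tree[i], depth)
    return child

def evolve(fitness, generations=50, pop_size=200):
    """One run of the GP cycle: evaluate, select by tournament, breed, replace."""
    population = [random_tree() for _ in range(pop_size)]
    best = max(population, key=fitness)
    for _ in range(generations):
        def tournament():
            return max(random.sample(population, 4), key=fitness)
        population = [mutate(crossover(tournament(), tournament()))
                      for _ in range(pop_size)]
        best = max(population + [best], key=fitness)
    return best
```

A problem-specific fitness callable, such as the distance-based measure defined in Section 3.2, is supplied as the fitness argument of evolve.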

3. The Learning of Discriminant Functions Based on Genetic Programming

3.1 A Formal Description of the Classification Problem

Given a data set S, each datum x in S has n attributes. Let x be represented as x = (v_1, v_2, ..., v_t, ..., v_n), 1 ≤ t ≤ n, where v_t ∈ R stands for the t-th attribute of x. Let C = {C_1, C_2, ..., C_K} be the set of K predefined classes. We say that <x_j, c_j> is a sample if the object x_j has been assigned to a specified class c_j, c_j ∈ C. We then define a training set T to be a set of samples,

    T = { <x_j, c_j> | x_j = (v_1, v_2, ..., v_n), c_j ∈ C, 1 ≤ j ≤ m },

where m is the number of samples in T, i.e. |T| = m. We denote the number of samples in T belonging to class C_i by m_i, 1 ≤ i ≤ K. A discriminant function f_i is a function mapping from R^n to R. For a sample <x_j, c_j>, the functions should satisfy the following conditions:

    f_i(x_j) ≥ a  if c_j = C_i,   and   f_i(x_j) < a  if c_j ≠ C_i,   where 1 ≤ i ≤ K, 1 ≤ j ≤ m.

The set of discriminant functions F is defined as F = { f_i | f_i : R^n → R, 1 ≤ i ≤ K }.

3.2 The Distance-Based Fitness Function

In the learning procedure, we first prepare the training set T, which includes positive instances and negative instances [11]; the learning procedure can be started after T is prepared [1][11]. We denote an individual in GP by h_i. Considering a discriminant function f_i for a class C_i, we give a specified constant a and require that f_i(x_j) ≥ a for positive instances and f_i(x_j) < a for negative instances. Two parameters p and q are defined such that p > a and q < a. For the individual h_i, the fitness measure for a positive instance <x_j, c_j> (c_j = C_i) is defined as

    D_p(h_i(x_j), c_j) = 0                   if h_i(x_j) ≥ a,
                       = [p - h_i(x_j)]^2    if h_i(x_j) < a,        (1)

and the fitness measure for a negative instance (c_j ≠ C_i) is defined as

    D_n(h_i(x_j), c_j) = [h_i(x_j) - q]^2    if h_i(x_j) ≥ a,
                       = 0                   if h_i(x_j) < a.        (2)

The fitness value of an individual h_i on the training set T is defined as

    Fitness(h_i, T) = - Σ_{j=1}^{m} [ D_p(h_i(x_j), c_j) + D_n(h_i(x_j), c_j) ],        (3)

where <x_j, c_j> ∈ T, 1 ≤ j ≤ m. Since we use the negative of the measure as the fitness value of an individual, the best fitness value is zero.
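As a sketch of how Eqs. (1)-(3) translate into code, the function below scores one candidate individual against a training set. It is a minimal illustration under the assumptions that a sample is a pair (x, c) and that h is any callable mapping a sample to a real value; the constants a, p and q shown as defaults are placeholders, since the paper treats their values as user-specified.

```python
def distance_fitness(h, training_set, target_class, a=0.0, p=1.0, q=-1.0):
    """Distance-based fitness of Eqs. (1)-(3) for one candidate individual.

    `h` maps a sample x to a real value; `training_set` is a list of (x, c)
    pairs; a sample is a positive instance when c equals `target_class`.
    The constants satisfy p > a and q < a; the defaults are placeholders."""
    total = 0.0
    for x, c in training_set:
        value = h(x)
        if c == target_class:                       # positive instance, Eq. (1)
            total += 0.0 if value >= a else (p - value) ** 2
        else:                                       # negative instance, Eq. (2)
            total += (value - q) ** 2 if value >= a else 0.0
    return -total                                   # Eq. (3): best fitness is 0
```

Paired with a GP engine such as the sketch in Section 2, a discriminant function for class C_i would be obtained by evolving individuals that maximize this fitness, whose best attainable value is zero.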

4. The Function-based Classifiers

In this paper, we propose the FDT method to resolve the two ambiguity problems. In the FDT, each node stands for a discriminant function. An object passing through the FDT is taken out of the dataset as soon as it is recognized by a node, so the ambiguity is avoided. In the FDT, we generate only K-1 discriminant functions for a K-class problem; objects that reach the last node and are not recognized by f_(K-1) are classified into the class C_K.

To construct the FDT, a good permutation of the nodes is important for accurate classification. The performance of a node is measured by its precision and recall [1]: higher precision means a lower misclassification rate, and higher recall means a higher recognition rate. We now define the permutation of the FDT. First, from the classification results of the training objects with the generated K-1 discriminant functions, the precision and recall of each discriminant function f_i are evaluated and denoted by <p_i, r_i>. We then define the permutation of the discriminant functions as f_(1), f_(2), ..., f_(i), f_(i+1), ..., f_(K-1), where f_(i) stands for the function f_i placed so that p_i > p_(i+1), or p_i = p_(i+1) and r_i ≥ r_(i+1). A sketch of this ordering and of classification with the resulting tree is given below.
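The sketch, again a non-authoritative illustration, assumes the K-1 discriminant functions are held in a dict keyed by class label and that the recognition threshold is the constant a of Section 3.2; the 0.0 defaults are placeholders, not values used in the paper.

```python
def node_precision_recall(f, training_set, target_class, a=0.0):
    """Precision and recall of one discriminant function on the training set."""
    recognized = [(x, c) for x, c in training_set if f(x) >= a]
    true_pos = sum(1 for _, c in recognized if c == target_class)
    positives = sum(1 for _, c in training_set if c == target_class)
    precision = true_pos / len(recognized) if recognized else 0.0
    recall = true_pos / positives if positives else 0.0
    return precision, recall

def order_nodes(functions, training_set, a=0.0):
    """Permute the K-1 discriminant functions for the FDT: decreasing
    precision, with ties broken by decreasing recall (Section 4)."""
    scores = {cls: node_precision_recall(f, training_set, cls, a)
              for cls, f in functions.items()}
    return sorted(functions, key=lambda cls: scores[cls], reverse=True)

def classify_with_fdt(x, ordered_classes, functions, default_class, a=0.0):
    """Walk the FDT: the first node whose function reaches the threshold `a`
    claims the object; anything left unrecognized falls into class C_K."""
    for cls in ordered_classes:
        if functions[cls](x) >= a:
            return cls
    return default_class
```

Sorting by the (precision, recall) pair relies on Python's lexicographic tuple comparison, which realizes the tie-breaking rule directly.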

5. EXPERIMENTAL RESULTS

In the experiments, we modified GP Quick 2.1 [14] to fit the requirements of the proposed approaches. The test datasets are Fisher's Iris dataset (IRIS) [3] and the Wisconsin Breast Cancer dataset (WBCD) [12]. The accuracy of the classification results is evaluated by average accuracy and overall accuracy [1]: the average accuracy is the average recognition rate of the discriminant functions, and the overall accuracy is the recognition performance of the classifier as a whole. Table 1 lists the parameters of genetic programming used in the two experiments.

5.1 Fisher's Iris

The Iris dataset [3] contains 150 data separated into three classes: Setosa, Versicolor, and Virginica. Each class contains 50 data, and each datum has four numerical attributes, denoted SL, SW, PL, and PW. We randomly select 25 data from each class to construct the training set. After the learning procedure, three discriminant functions are generated:

    f_Setosa = SW - PL,
    f_Versicolor = ((((((((((((((PL + PL) - (-33 / PL)) / -22) - 11) / (SW - PL)) * 99) - 121) - PW) * PL) / -31) + 45) / -21) - PW) / 43),
    f_Virginica = ((((PW * PL) + (-11 / PW)) - ((((-26 - 92) / (PW * 59)) / -7) / PL)) / SL).

The classification results on the training set and the test set using these three functions are shown in Table 2. The permutation of the discriminant functions in the FDT is f_(1) = f_Setosa, f_(2) = f_Versicolor. The classification result of the FDT is also shown in Table 2; only one object is misclassified, and both the average accuracy and the overall accuracy are high. Finally, we compare the result with previous work in Table 3.

5.2 Wisconsin Breast Cancer Dataset

WBCD [12] contains 699 data separated into two classes, Malignant and Benign. Each datum has 9 numerical attributes. Since 16 data have missing values, the remaining 683 data are used in this experiment: the Malignant class contains 239 data and the Benign class contains 444 data. The training set is constructed from 119 Malignant data and 222 Benign data; the remaining data form the test set. The 9 attributes of a sample are denoted F1, F2, ..., F9 and are used as the terminals in GP. The classification results on the training set and the test set are shown in Table 4. From Table 4, the permutation of the two discriminant functions is obtained as f_(1) = f_Benign, f_(2) = f_Malignant. The classification results of the FDT are also shown in Table 4. Finally, we compare previous classification results with the proposed method in Table 5.

Table 1: Parameters used in the experiments.

    Parameter                    Value
    Node mutate weight           43.5%
    Mutation weight annealing    40%
    Mutate constant weight       43.5%
    Population size              1000
    Mutate shrink weight         13%
    Generations in Iris          5000
    Selection method             Tournament
    Generations in WBCD
    Mutation weight              8%
    Crossover weight             28%
    Terminal set in Iris         SL, SW, PL, PW
    Crossover weight annealing   20%
    Terminal set in WBCD         F1, F2, ..., F9
    Function set                 +, -, *, /

Table 2: The classification results on the Iris dataset (each entry: precision / recall).

    Class         Discriminant functions   Discriminant functions   FDT
                  (training set)           (test set)               (test set)
    Setosa        100% / 100%              100% / 100%              100% / 100%
    Versicolor    100% / 80.0%             100% / 96.0%             100% / 96.0%
    Virginica     88.9% / 96.0%            89.3% / 100%             96.2% / 100%
    Average accuracy   98.6%
    Overall accuracy   98.6%

Table 3: The accuracy comparison on the Iris dataset.

    Models or methods                      Recognition rate
    GPCE [6]                               96.0%
    FUNLVQ+GFENCE [9]                      96.3%
    FEBFC with 4 features [10]             96.7%
    FEBFC with 2 selected features [10]    97.1%
    FDT                                    98.6%

Table 4: The classification results on WBCD (each entry: precision / recall).

    Class        Discriminant functions   Discriminant functions   FDT
                 (training set)           (test set)               (test set)
    Malignant    95.93% / 99.16%          95.16% / 98.33%          95.12% / 97.50%
    Benign       99.54% / 96.85%          98.63% / 97.30%          98.63% / 97.30%
    Average accuracy   97.40%
    Overall accuracy   97.37%

Table 5: The comparison of classification results on WBCD.

    Models or methods                      Recognition rate
    C4.5 (cited from [10])                 93.1%
    NNFS with 9 features [13]              93.94%
    NNFS with avg. features [13]           94.15%
    FEBFC with 9 features [10]             94.67%
    FEBFC with selected 6 features [10]    95.14%
    FDT                                    97.37%
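As a closing illustration, the three evolved Iris functions of Section 5.1 can be transcribed directly into Python and chained in the FDT order f_(1) = f_Setosa, f_(2) = f_Versicolor, with Virginica as the fall-through class. This is a sketch only: the threshold constant a actually used in training is not reported, so the default 0.0 below is a placeholder, and classify_iris is a hypothetical helper name.

```python
# The three evolved Iris discriminant functions from Section 5.1, transcribed
# verbatim; each takes the four attributes SL, SW, PL, PW of one sample.
def f_setosa(SL, SW, PL, PW):
    return SW - PL

def f_versicolor(SL, SW, PL, PW):
    return ((((((((((((((PL + PL) - (-33 / PL)) / -22) - 11) / (SW - PL))
                     * 99) - 121) - PW) * PL) / -31) + 45) / -21) - PW) / 43)

def f_virginica(SL, SW, PL, PW):
    return ((((PW * PL) + (-11 / PW))
            - ((((-26 - 92) / (PW * 59)) / -7) / PL)) / SL)

def classify_iris(SL, SW, PL, PW, a=0.0):
    """FDT of Section 5.1: f_(1) = f_Setosa, f_(2) = f_Versicolor, and any
    sample recognized by neither falls through to Virginica. The threshold
    `a` is the constant of Section 3.2; its trained value is not reported,
    so 0.0 is only a placeholder."""
    if f_setosa(SL, SW, PL, PW) >= a:
        return "Setosa"
    if f_versicolor(SL, SW, PL, PW) >= a:
        return "Versicolor"
    return "Virginica"

# Example: the first record of Fisher's Iris data, a Setosa measurement
# (sepal length, sepal width, petal length, petal width in cm).
print(classify_iris(5.1, 3.5, 1.4, 0.2))
```

With the placeholder threshold the call prints Setosa for this sample (f_Setosa = 3.5 - 1.4 = 2.1 ≥ 0); reproducing the accuracies of Tables 2 and 3 would require the constants actually used in the experiments.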

6. CONCLUSIONS

This paper presents a distance-based fitness function for genetic programming that learns efficient discriminant functions, and proposes the FDT classification approach to resolve the ambiguity among them. The experimental results show that the proposed method achieves high accuracy. However, the attributes of the objects in these datasets are all numerical, and there is no simple way to handle categorical attributes in classification by discriminant functions. Future work on text classification based on discriminant functions is worth investigating.

REFERENCES

[1] B. C. Chien and J. Y. Lin, Learning Discriminating Functions Based on Genetic Programming for Classification, in Proc. TAAI2001, Taiwan (2001).
[2] R. O. Duda and P. E. Hart, Pattern Classification and Scene Analysis, John Wiley & Sons, New York (1973).
[3] R. A. Fisher, The Use of Multiple Measurements in Taxonomic Problems, Ann. Eugenics, pt. II 7 (1936).
[4] A. A. Freitas, A Genetic Programming Framework for Two Data Mining Tasks: Classification and Generalized Rule Induction, in Proc. 2nd Annual Conference, Morgan Kaufmann (1997).
[5] D. Heckerman and M. P. Wellman, Bayesian Networks, Communications of the ACM 38, No. 3 (1995).
[6] J. K. Kishore et al., Application of Genetic Programming for Multicategory Pattern Classification, IEEE Trans. on Evolutionary Computation 4, No. 3 (2000).
[7] J. R. Koza, Genetic Programming: On the Programming of Computers by Means of Natural Selection, MIT Press, Cambridge (1992).
[8] J. R. Koza et al. (eds.), Genetic Programming 1996, MIT Press, Cambridge (1996).
[9] H. M. Lee, A Neural Network Classifier with Disjunctive Fuzzy Information, Neural Networks 11, No. 6 (1998).
[10] H. M. Lee et al., An Efficient Fuzzy Classifier with Feature Selection Based on Fuzzy Entropy, IEEE Trans. on Systems, Man, and Cybernetics, Part B: Cybernetics 31, No. 3 (2001).
[11] J. Y. Lin and B. C. Chien, Learning Classification Rules with Fuzzy Attributes Using Genetic Programming, in Proc. of the Ninth National Conference on Fuzzy Theory and Its Applications, Taiwan (2001).
[12] O. L. Mangasarian and W. H. Wolberg, Cancer Diagnosis via Linear Programming, SIAM News 23, No. 5 (1990).
[13] R. Setiono et al., Neural-Network Feature Selector, IEEE Trans. on Neural Networks 8 (1997).
[14] A. Singleton, Genetic Programming with C++, Byte (Feb. 1994).
[15] C. H. Wang et al., A Fuzzy Inductive Learning Strategy for Modular Rules, Fuzzy Sets and Systems 103 (1999).
[16] C. H. Wang et al., Integrating Fuzzy Knowledge by Genetic Algorithms, IEEE Trans. on Evolutionary Computation 2, No. 4 (1998).
[17] G. P. Zhang, Neural Networks for Classification: A Survey, IEEE Trans. on Systems, Man, and Cybernetics, Part C: Applications and Reviews 30, No. 4 (2000).
