A Modified Fuzzy Min-Max Neural Network and Its Application to Fault Classification
1 A Modified Fuzzy Min-Max Neural Network and Its Application to Fault Classification Anas M. Quteishat and Chee Peng Lim School of Electrical & Electronic Engineering University of Science Malaysia
2 Abstract The objectives of this paper are twofold. First, to improve the Fuzzy Min-Max (FMM) classification performance in situations where the network forms large hyperboxes: the Euclidean distance is computed after network training, and both the membership value of the hyperbox fuzzy sets and the Euclidean distance are used for classification. Second, to assess the effectiveness of the modified FMM network: benchmark pattern classification problems are first used, and the results from different methods are compared; in addition, a fault classification problem with real sensor measurements collected from a power generation plant is used to evaluate the applicability of the modified FMM network.
3 Introduction Learning classifiers fall into two types, supervised and unsupervised, which differ in how they are trained. The FMM neural network is a pattern classification system that can tackle either clustering (unsupervised) or classification (supervised) problems. In this paper, FMM is used as a supervised classification system.
4 Introduction (contd.) FMM is constructed from hyperbox fuzzy sets, each of which is an n-dimensional box defined by a pair of minimum and maximum points. Each input pattern is classified based on its degree of membership in the corresponding hyperboxes. A smaller hyperbox size means each hyperbox can contain only a small number of patterns; this increases the network complexity but gives high accuracy. A larger hyperbox size means each hyperbox can contain a larger number of patterns; this decreases the network complexity but leads to lower classification accuracy.
5 FMM Neural Network The FMM classification network is formed using hyperbox fuzzy sets. A hyperbox defines a region of the n-dimensional pattern space containing patterns with full class membership. The hyperbox is completely defined by its minimum and maximum points. Fig. 1. A min-max hyperbox, annotated with its min point and max point.
6 The definition of each hyperbox fuzzy set B_j is

B_j = { X, V_j, W_j, f(X, V_j, W_j) },  ∀ X ∈ I^n   (1)

where X is the input, I^n is the n-dimensional unit cube, and V_j and W_j are the min and max points, respectively. The hyperbox size parameter θ can take a value between 0 and 1. A small value of θ produces small hyperboxes, and vice versa.
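As an illustrative sketch (not the authors' code), a hyperbox can be stored as its min point V and max point W in the unit cube, with θ bounding how far it may grow; the expansion test on the mean edge length below is one common formulation and is an assumption here:

```python
# Illustrative hyperbox representation: fully defined by min point V and
# max point W; theta limits the box size during training.
import numpy as np

class Hyperbox:
    def __init__(self, point, label):
        # A new hyperbox starts as a degenerate box at its first pattern.
        self.V = np.asarray(point, dtype=float)  # min point
        self.W = np.asarray(point, dtype=float)  # max point
        self.label = label

    def can_expand(self, point, theta):
        # Expansion test (assumed form): the mean edge length of the box
        # after absorbing `point` must not exceed theta.
        new_V = np.minimum(self.V, point)
        new_W = np.maximum(self.W, point)
        return float(np.mean(new_W - new_V)) <= theta

    def expand(self, point):
        # Grow the box just enough to contain `point`.
        self.V = np.minimum(self.V, point)
        self.W = np.maximum(self.W, point)
```

With a small θ the test fails sooner, so more (smaller) hyperboxes are created, matching the complexity/accuracy trade-off described above.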
7 Input patterns are classified according to how well they are contained by a hyperbox, measured by the membership value of a pattern:

b_j(A_h) = (1/2n) Σ_{i=1}^{n} [ max(0, 1 − max(0, γ min(1, a_hi − w_ji))) + max(0, 1 − max(0, γ min(1, v_ji − a_hi))) ]   (2)

where A_h = (a_h1, a_h2, ..., a_hn) ∈ I^n is the h-th input pattern, V_j = (v_j1, v_j2, ..., v_jn) is the min point of B_j, W_j = (w_j1, w_j2, ..., w_jn) is the max point of B_j, and γ is the sensitivity parameter.
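A minimal sketch of this membership computation, assuming the standard two-sided ramp form with sensitivity parameter γ (a pattern fully inside the box gets membership 1, and membership decays with distance outside each face):

```python
# Sketch of the FMM membership function (Eq. 2); names are illustrative.
import numpy as np

def membership(a_h, V_j, W_j, gamma=4.0):
    """Degree to which pattern a_h belongs to the hyperbox [V_j, W_j]."""
    a, V, W = (np.asarray(x, dtype=float) for x in (a_h, V_j, W_j))
    n = a.size
    # Penalty for exceeding the max point in each dimension...
    above = np.maximum(0.0, 1.0 - np.maximum(0.0, gamma * np.minimum(1.0, a - W)))
    # ...and for falling below the min point.
    below = np.maximum(0.0, 1.0 - np.maximum(0.0, gamma * np.minimum(1.0, V - a)))
    return float(np.sum(above + below) / (2 * n))
```

For example, a point inside the box [0.2, 0.4] x [0.2, 0.4] yields membership 1.0, while a point 0.1 outside one face (with γ = 4) yields 0.9.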
8 Modifications of the FMM Network The modifications were made to the prediction stage only; the original FMM learning procedure was left untouched. In this paper we propose two prediction methods: 1. using the Euclidean distance; 2. using both the Euclidean distance and the membership value.
9 1) Prediction using the Euclidean distance This prediction method is based on the Euclidean distance between the input pattern and the centroid of the hyperbox. In addition to the min and max points, the centroid of the patterns falling in each hyperbox is computed as follows:

C_ji = C_ji + (a_hi − C_ji) / N_j   (3)

where C_ji is the centroid of the j-th hyperbox in the i-th dimension, and N_j is the number of patterns included in the j-th hyperbox.
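Eq. (3) is a running-mean update: moving the centroid toward each new pattern by 1/N_j keeps C_j equal to the mean of all patterns absorbed so far. A minimal sketch:

```python
# Running-centroid update of Eq. (3); variable names mirror the equation.
import numpy as np

def update_centroid(C_j, N_j, a_h):
    """Return (new centroid, new count) after hyperbox j absorbs pattern a_h."""
    N_j += 1
    C_j = C_j + (np.asarray(a_h, dtype=float) - C_j) / N_j
    return C_j, N_j

C, N = np.zeros(2), 0
for pattern in ([0.2, 0.4], [0.4, 0.6], [0.6, 0.8]):
    C, N = update_centroid(C, N, pattern)
# C is now the mean of the three patterns
```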
10 The Euclidean distance between the centroid and the input pattern is calculated using:

E_jh = sqrt( Σ_{i=1}^{n} (C_ji − a_hi)^2 )   (4)

where E_jh is the Euclidean distance between the j-th hyperbox and the h-th input pattern. In the classification process, the hyperbox with the smallest Euclidean distance to the input pattern is selected as the winner, and the pattern is classified as belonging to that hyperbox.
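This winner-take-all rule can be sketched as follows (the function and argument names are illustrative, not from the paper's code):

```python
# Prediction by Eq. (4): classify the input pattern with the label of the
# hyperbox whose centroid is nearest in Euclidean distance.
import numpy as np

def predict_by_distance(a_h, centroids, labels):
    a = np.asarray(a_h, dtype=float)
    C = np.asarray(centroids, dtype=float)
    E = np.sqrt(np.sum((C - a) ** 2, axis=1))  # E_jh for every hyperbox j
    return labels[int(np.argmin(E))]
```

For example, a pattern close to the class-1 centroid is assigned class 1 even if both hyperboxes would contain it.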
11 Figure 2 illustrates the classification of a two-dimensional input pattern using this method: the input is assigned to the hyperbox (class 1 or class 2) whose centroid (C_1 or C_2) lies at the smaller Euclidean distance (E_1 or E_2). Fig. 2. The classification process of an input pattern using the Euclidean distance.
12 2) Prediction using both the membership function and the Euclidean distance When θ is large, hyperbox sizes are large; as a consequence, more than one hyperbox may have a very large membership value for a given pattern, sometimes even unity. To solve this problem we propose using both the membership value and the Euclidean distance for classification: the hyperboxes with the highest membership value are selected, the Euclidean distance between the centroids of these boxes and the input pattern is calculated, and the hyperbox with the smallest distance is used to classify the input pattern.
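The two-stage rule above can be sketched as follows; the (V, W, centroid, label) tuple layout and the membership form are illustrative assumptions:

```python
# Combined rule: keep every hyperbox tied at the highest membership value,
# then break the tie by the smallest centroid distance.
import numpy as np

def predict_combined(a_h, boxes, gamma=4.0, tol=1e-9):
    """boxes: list of (V, W, centroid, label) tuples (assumed layout)."""
    a = np.asarray(a_h, dtype=float)

    def memb(V, W):
        # Two-sided ramp membership, as in Eq. (2).
        above = np.maximum(0.0, 1.0 - np.maximum(0.0, gamma * np.minimum(1.0, a - W)))
        below = np.maximum(0.0, 1.0 - np.maximum(0.0, gamma * np.minimum(1.0, V - a)))
        return float(np.sum(above + below) / (2 * a.size))

    m = np.array([memb(np.asarray(V, float), np.asarray(W, float))
                  for V, W, _, _ in boxes])
    tied = np.flatnonzero(m >= m.max() - tol)  # hyperboxes sharing the top membership
    dists = [np.linalg.norm(np.asarray(boxes[i][2], float) - a) for i in tied]
    return boxes[tied[int(np.argmin(dists))]][3]
```

When two large hyperboxes both contain the pattern (membership 1 for both), the centroid distance alone decides the class, which is exactly the ambiguity this modification targets.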
13 Experiments and results: The proposed methods were tested using four data sets. A) Benchmark data sets: 1) PID data set; 2) IRIS data set. B) Fault diagnosis data sets: 1) the heat transfer conditions; 2) the tube blockage conditions.
14 A) Benchmark data sets: 1) PID data set The Pima Indian Diabetes (PID) data set consists of 768 samples in a two-class problem. The set was divided into 75% for training and 25% for testing. The experiment was run 5 times, and the average results are shown in Figure 3. A comparison between the results obtained by our methods and other methods based on the same experimental criteria is shown in Table 1.
15 Fig. 3. The testing accuracy rates (%) against θ for the PID problem. Curve A shows the accuracy rate using the membership value only; curve B shows the accuracy rate using both the membership value and Euclidean distance; curve C shows the accuracy rate using the Euclidean distance only.
16 Table 1. Classification accuracy from various methods for the PID data set

Method                   Accuracy (%)
LDA                      77.5
C
CART                     72.8
k-NN                     71.9
Curve A (best result)
Curve B (best result)    72.4
Curve C (best result)    74.9
17 A) Benchmark data sets: 2) IRIS data set The IRIS data set consists of 150 samples in a 3-class classification problem. The data set was divided into two sets: a training set consisting of 80% of each class, and a test set with the remaining samples. The experiment was conducted on the data set, and the results of the proposed methods along with the original FMM are shown in Figure 4. Table 2 shows the maximum accuracy of various methods in comparison with the proposed methods (all experiments were conducted using the same training and test data sets).
18 Fig. 4. The testing accuracy rates (%) against θ for the IRIS data set. Curve A shows the accuracy rate using the membership value only; curve B shows the accuracy rate using both the membership value and Euclidean distance; curve C shows the accuracy rate using the Euclidean distance only.
19 Table 2. Classification accuracy from various methods for the IRIS data set

Method                   Accuracy (%)
C
OC
LMDT
LVQ
Curve A (best result)
Curve B (best result)
Curve C (best result)    94.00
20 B) Fault Classification A fault detection and classification system predicts failures and, when a failure occurs, identifies its cause(s). In this study, we investigate the applicability of the modified FMM network using a set of sensor measurements collected from a power generation plant in Malaysia. The system under consideration is a circulating water (CW) system, as shown in Figure 5.
21 Fig. 5. The circulating water system, showing the low-pressure turbines, condenser, CW pumps, common discharge header, strainer, primary bar screen, the seawater intake and discharge to sea, and the condensate reused in the steam cycle process.
22 A data set of 2439 samples was collected. Each sample consisted of 12 features comprising temperature and pressure measurements at various inlet and outlet points of the condenser, as well as other important parameters. Two case studies were conducted: 1. the heat transfer conditions; 2. the tube blockage conditions.
23 B) Fault Classification 1) Heat Transfer Conditions The heat transfer conditions were classified into two categories: efficient or inefficient. Of the data set, 1224 samples (50.18%) showed an inefficient heat transfer condition, whereas 1215 samples (49.82%) showed an efficient heat transfer condition in the condenser. The data set (excluding one sample) was divided into two equal sets, each containing 1219 samples, one for training and the other for testing. Both sets contained 50% of the data samples belonging to each class.
24 Figure 6 shows the testing accuracy of the proposed methods along with the original FMM. Table 3 shows the testing accuracy along with the number of hyperboxes used in the classification.
25 Fig. 6. The testing accuracy rates (%) against θ for the heat transfer conditions. Curves A, B, and C show the testing accuracy rates using the membership value only, both the membership value and Euclidean distance, and the Euclidean distance only, respectively.
26 Table 3. Testing accuracy for the heat transfer data set (columns: θ, membership value (%), Euclidean distance (%), membership value and Euclidean distance (%), number of hyperboxes).
27 B) Fault Classification 2) Tube Blockage Conditions In this experiment, the objective was to predict the occurrence of tube blockage in the CW system. The conditions of the condenser tubes were categorized into two classes: significant blockage and insignificant blockage. The data set used in the previous experiment was again employed; a total of 1313 samples (53.83%) showed significant blockage, and the remaining samples showed insignificant blockage in the condenser tubes. As before, the data set (excluding one sample) was divided into two equal sets of 1219 samples each, one for training and the other for testing.
28 Figure 7 shows the testing accuracy of the proposed methods along with the original FMM. Table 4 shows the testing accuracy along with the number of hyperboxes used in the classification.
29 Fig. 7. The testing accuracy rates (%) against θ for the tube blockage conditions. Curves A, B, and C show the testing accuracy rates using the membership value only, both the membership value and Euclidean distance, and the Euclidean distance only, respectively.
30 Table 4. Testing accuracy for the tube blockage data set (columns: θ, membership value (%), Euclidean distance (%), membership value and Euclidean distance (%), number of hyperboxes).
31 Conclusions The results obtained reveal the usefulness of the proposed modifications in improving the performance of FMM when large hyperboxes are formed by the network.
More informationCHAPTER 4 AN IMPROVED INITIALIZATION METHOD FOR FUZZY C-MEANS CLUSTERING USING DENSITY BASED APPROACH
37 CHAPTER 4 AN IMPROVED INITIALIZATION METHOD FOR FUZZY C-MEANS CLUSTERING USING DENSITY BASED APPROACH 4.1 INTRODUCTION Genes can belong to any genetic network and are also coordinated by many regulatory
More informationFX2-CHILLER. Digital Control. Operations Manual
FX2-CHILLER Digital Control Operations Manual Micro Air Corporation Phone (609) 259-2636 124 Route 526 www.microair.net Allentown NJ 08501 Fax (609) 259-6601 Introduction: The FX2-CHILLER digital control
More informationSupervised vs. Unsupervised Learning
Clustering Supervised vs. Unsupervised Learning So far we have assumed that the training samples used to design the classifier were labeled by their class membership (supervised learning) We assume now
More informationKTH ROYAL INSTITUTE OF TECHNOLOGY. Lecture 14 Machine Learning. K-means, knn
KTH ROYAL INSTITUTE OF TECHNOLOGY Lecture 14 Machine Learning. K-means, knn Contents K-means clustering K-Nearest Neighbour Power Systems Analysis An automated learning approach Understanding states in
More informationProperties of learning of a Fuzzy ART Variant
NN 38 PERGAMON Neural Networks 2 (999) 837 85 Neural Networks wwwelseviercom/locate/neunet Properties of learning of a Fuzzy ART Variant M Georgiopoulos a, *, I Dagher a, GL Heileman b, G Bebis c a Department
More informationUnsupervised Learning
Outline Unsupervised Learning Basic concepts K-means algorithm Representation of clusters Hierarchical clustering Distance functions Which clustering algorithm to use? NN Supervised learning vs. unsupervised
More informationDiscriminate Analysis
Discriminate Analysis Outline Introduction Linear Discriminant Analysis Examples 1 Introduction What is Discriminant Analysis? Statistical technique to classify objects into mutually exclusive and exhaustive
More informationARTICLE; BIOINFORMATICS Clustering performance comparison using K-means and expectation maximization algorithms
Biotechnology & Biotechnological Equipment, 2014 Vol. 28, No. S1, S44 S48, http://dx.doi.org/10.1080/13102818.2014.949045 ARTICLE; BIOINFORMATICS Clustering performance comparison using K-means and expectation
More information2009 E09PS E09PS E09PS E09PS E09PS E09PS38 IEEE 2009 E09PS39 E09PS40 E09PS41 E09PS42 E09PS43 IEEE 2008 E09PS44
1 CODE IEEE TRANSACTION POWER SYSTEM YEAR E09PS32 E09PS01 E09PS02 E09PS03 E09PS04 E09PS05 E09PS06 E09PS07 E09PS08 E09PS09 E09PS10 E09PS11 E09PS12 E09PS13 E09PS14 E09PS15 E09PS16 E09PS17 E09PS18 E09PS19
More informationHigh throughput Data Analysis 2. Cluster Analysis
High throughput Data Analysis 2 Cluster Analysis Overview Why clustering? Hierarchical clustering K means clustering Issues with above two Other methods Quality of clustering results Introduction WHY DO
More informationJarek Szlichta
Jarek Szlichta http://data.science.uoit.ca/ Approximate terminology, though there is some overlap: Data(base) operations Executing specific operations or queries over data Data mining Looking for patterns
More informationValidation of the Control Quality of Characteristic Field Based Fuzzy Controllers
Validation of the Control Quality of Characteristic Field Based Fuzzy Controllers R. Hampel Institute of Process Automation and Measuring Technique (IPM) University of Applied Sciences Zittau Theodor-Korner-Allee
More informationEvolutionary Instance Selection Algorithm based on Takagi-Sugeno Fuzzy Model
Appl. Math. Inf. Sci. 8, No. 3, 1307-1312 (2014) 1307 Applied Mathematics & Information Sciences An International Journal http://dx.doi.org/10.12785/amis/080346 Evolutionary Instance Selection Algorithm
More informationExperimental Study on Fault Detection Algorithm Using Regression Method for Plural Indoor Units Faults of Multi-Heat Pump System under Heating Mode
Purdue University Purdue e-pubs International Refrigeration and Air Conditioning Conference School of Mechanical Engineering 2012 Experimental Study on Fault Detection Algorithm Using Regression Method
More informationMass Classification Method in Mammogram Using Fuzzy K-Nearest Neighbour Equality
Mass Classification Method in Mammogram Using Fuzzy K-Nearest Neighbour Equality Abstract: Mass classification of objects is an important area of research and application in a variety of fields. In this
More informationOutline. Prepare the data Classification and regression Clustering Association rules Graphic user interface
Data Mining: i STATISTICA Outline Prepare the data Classification and regression Clustering Association rules Graphic user interface 1 Prepare the Data Statistica can read from Excel,.txt and many other
More informationK-Nearest-Neighbours with a Novel Similarity Measure for Intrusion Detection
K-Nearest-Neighbours with a Novel Similarity Measure for Intrusion Detection Zhenghui Ma School of Computer Science The University of Birmingham Edgbaston, B15 2TT Birmingham, UK Ata Kaban School of Computer
More informationMachine Learning using MapReduce
Machine Learning using MapReduce What is Machine Learning Machine learning is a subfield of artificial intelligence concerned with techniques that allow computers to improve their outputs based on previous
More informationMachine Learning (CSMML16) (Autumn term, ) Xia Hong
Machine Learning (CSMML16) (Autumn term, 28-29) Xia Hong 1 Useful books: 1. C. M. Bishop: Pattern Recognition and Machine Learning (2007) Springer. 2. S. Haykin: Neural Networks (1999) Prentice Hall. 3.
More informationConstructive Feedforward ART Clustering Networks Part I
IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 13, NO. 3, MAY 2002 645 Constructive Feedforward ART Clustering Networks Part I Andrea Baraldi and Ethem Alpaydın Abstract Part I of this paper proposes a definition
More information3 Feature Selection & Feature Extraction
3 Feature Selection & Feature Extraction Overview: 3.1 Introduction 3.2 Feature Extraction 3.3 Feature Selection 3.3.1 Max-Dependency, Max-Relevance, Min-Redundancy 3.3.2 Relevance Filter 3.3.3 Redundancy
More information