A Study on Clustering Method by Self-Organizing Map and Information Criteria
A Study on Clustering Method by Self-Organizing Map and Information Criteria

Satoru Kato, Tadashi Horiuchi, and Yoshio Itoh
Matsue College of Technology, 4-4 Nishi-ikuma, Matsue, Shimane 90-88, JAPAN
Tottori University, 4-0 Koyama-cho minami, Tottori 80-80, JAPAN

Abstract. In this paper, we propose a clustering method based on the Self-Organizing Map (SOM) and information criteria. In this method, initial cluster candidates are derived by SOM learning, and these candidates are then merged appropriately based on an information criterion such as BIC (Bayesian Information Criterion) or AIC (Akaike Information Criterion). Through clustering experiments on artificial datasets and datasets from the UCI Machine Learning Repository, we confirm that the proposed method can extract clusters more accurately and stably than the SOM-only method.

1 Introduction

Clustering by the Self-Organizing Map (SOM) [1] can extract clusters of arbitrary distribution shapes based on the distances between the code vectors (representative points of the input data) [2]. Several recent methods improve on the basic SOM algorithm [3][4]. SOM-based clustering is thus one of the distance-based clustering approaches. On the other hand, there are distribution-based clustering approaches that consider the distribution of the input data when extracting clusters. For example, the x-means method [5] incorporates the Bayesian Information Criterion (BIC) into the k-means method. Information criteria are also easily introduced into SOM-based clustering.

In this paper, we propose a clustering method that combines the SOM with information criteria. In the proposed method, initial cluster candidates are derived by SOM learning, and these candidates are then merged appropriately based on an information criterion such as BIC or AIC. Through clustering experiments on artificial datasets and UCI Machine Learning Repository datasets, we confirm that the proposed method can extract clusters more accurately and stably than the SOM-only method.
Furthermore, we show that AIC is better suited to the proposed method than BIC.
2 Clustering by SOM

2.1 Basic SOM algorithm

The SOM, proposed by Kohonen, is configured as shown in Fig. 1. In the basic learning algorithm [1], the code vectors are updated by the following equations:

w_i(t+1) = w_i(t) + α(t) Φ(p_i) (x − w_i(t))   (1)

Φ(p_i) = exp( −p_i² / σ(t)² )   (2)

Here α(t) is the learning coefficient after t learning steps. The coefficient starts from its initial value α_ini and decreases monotonically as t increases, reaching its minimum at the pre-set maximum number of learning steps T. Φ(p_i) is a neighborhood function centered at the winner cell c, and p_i is the distance from cell i to the winner cell c in the competitive layer. In Eq. (2), σ(t) is a time-varying parameter that defines the neighborhood size in the competitive layer. Like α(t) in Eq. (1), this parameter decreases monotonically from its initial value σ_ini as learning proceeds. As a result of learning, the similarity between learning data is expressed by closeness on the grid of the competitive layer, and the data density in the input data space is reflected in the distribution of the code vectors after learning.

2.2 Cluster extraction from the SOM

In maps built by SOM learning, the code vectors of adjacent cells in the competitive-layer grid are similar, and the data density in the input space is reflected in the distribution of code vectors after learning. Using these features, as pointed out by Terashima et al. [2], clustering is possible by detecting cluster boundaries as the portions where the code vectors of adjacent cells differ substantially. The specific clustering procedure is presented below. A one-dimensional SOM is used for simplicity of analysis; the m cells of the competitive layer are arranged in a one-dimensional array.

1. Map building: The input data are subjected to SOM learning to obtain a set of code vectors.
2. Map analysis:
(a) For every cell i (i = 1, ..., m−1), the code-vector density dw_i is found as the Euclidean distance between the code vectors of cells i and i+1:

dw_i = ||w_i − w_{i+1}||   (3)
Fig. 1. Basic structure of the one-dimensional SOM (input layer with input vector x; competitive layer of neuron cells i with code vectors w_i)

(b) The code-vector density dw_i for every cell i (i = 1, ..., m−1) is normalized by its maximum and minimum into the range [0, 1], giving the normalized density dw'_i:

dw'_i = (dw_i − dw_min) / (dw_max − dw_min)   (4)

(c) The histogram of dw'_i is derived. A cluster boundary is recognized between a cell i corresponding to a histogram peak and its neighbor cell i+1.

3. Labeling: The competitive layer is divided according to the dw'_i histogram, and every group of cells is labeled appropriately.

3 Proposed method

3.1 Basic idea

There are many upward peaks in the code-vector density histogram. Each of these may or may not clearly indicate a cluster boundary, so many cluster candidates can be extracted from the density histogram. The basic idea of the proposed method is to merge these cluster candidates appropriately using information criteria, by the following procedure:

A. Build the code-vector density (dw_i) histogram after the learning process of a one-dimensional SOM.
B. Extract cluster candidates from the density histogram and assign a consecutive number to each candidate. These numbers are ordered according to the neuron cells in the competitive layer of the SOM.
C. Decide which cluster candidates should be merged, from the pairs of candidates whose numbers adjoin each other.

Fig. 2 shows a practical sequence of the proposed method. Procedure A yields Fig. 2(a) and (b), and Fig. 2(c) is obtained after procedure B. Cluster candidates are then gradually merged by applying procedure C repeatedly until the number of clusters agrees with the original one (see Fig. 2(d)-(f)).
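As a concrete illustration, procedures A and B can be sketched as follows. This is a minimal sketch under our own assumptions: a Gaussian neighborhood and linearly decaying α(t) and σ(t) (the paper does not fix these schedules), a fixed peak threshold instead of an inspected histogram, and all function and parameter names are ours.

```python
import numpy as np

def train_som_1d(data, m=20, T=None, alpha_ini=0.5, sigma_ini=3.0, seed=0):
    """One-dimensional SOM learning (Eqs. (1)-(2))."""
    rng = np.random.default_rng(seed)
    N, p = data.shape
    T = T if T is not None else 10 * N
    w = data[rng.integers(0, N, size=m)].astype(float).copy()  # init from samples
    cells = np.arange(m)
    for t in range(T):
        x = data[rng.integers(0, N)]
        c = np.argmin(np.linalg.norm(w - x, axis=1))       # winner cell
        alpha = alpha_ini * (1.0 - t / T)                  # decaying learning rate
        sigma = sigma_ini * (1.0 - t / T) + 1e-2           # decaying neighborhood size
        phi = np.exp(-((cells - c) ** 2) / sigma ** 2)     # Eq. (2)
        w += alpha * phi[:, None] * (x - w)                # Eq. (1)
    return w

def extract_candidates(w, threshold=0.5):
    """Code-vector density (Eqs. (3)-(4)) and cluster candidates (procedure B).

    Returns the normalized densities dw' and one candidate label per cell;
    a boundary is placed wherever dw'_i exceeds `threshold` (a density peak).
    """
    dw = np.linalg.norm(w[:-1] - w[1:], axis=1)            # Eq. (3)
    dwn = (dw - dw.min()) / (dw.max() - dw.min())          # Eq. (4)
    labels = np.zeros(len(w), dtype=int)
    for i in np.where(dwn > threshold)[0]:
        labels[i + 1:] += 1                                # new candidate after each peak
    return dwn, labels
```

The candidate labels come out consecutively numbered along the competitive layer, which is exactly the ordering that procedure C relies on when it considers adjoining pairs.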
Fig. 2. Clustering process of the proposed method: (a) distribution of code vectors; (b) density histogram of code vectors; (c) cluster candidates (initial state); (d) 1st mergence; (e) 2nd mergence; (f) 3rd mergence

3.2 BIC and AIC

When a distribution of data x is observed, a family of alternative models that could generate the distribution can be considered. An information criterion is a useful guideline for determining which model is the most suitable. The Bayesian Information Criterion (BIC) and the Akaike Information Criterion (AIC) are typical ones, calculated by the following equations respectively:

BIC = −2 log L(θ̂; x) + q log n   (5)

AIC = −2 log L(θ̂; x) + 2q   (6)

where q is the dimension of the parameter vector θ̂ and n is the number of samples of the empirical distribution. Here L(·) is the likelihood of f(·), where f(·) is the p-dimensional Gaussian distribution:

f(θ̂; x) = (2π)^(−p/2) |V|^(−1/2) exp{ −(1/2) (x − μ)ᵀ V⁻¹ (x − μ) }   (7)
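Equations (5)-(7) can be computed directly. The sketch below fits a single p-dimensional Gaussian to data x by maximum likelihood and counts q = p + p(p+1)/2 free parameters for μ and V; the function name and parameter counting are our assumptions, not spelled out in the paper.

```python
import numpy as np

def gaussian_ic(x, criterion="BIC"):
    """BIC (Eq. (5)) or AIC (Eq. (6)) of one Gaussian (Eq. (7)) ML-fitted to x."""
    n, p = x.shape
    mu = x.mean(axis=0)
    V = np.cov(x, rowvar=False, bias=True).reshape(p, p)   # ML covariance estimate
    d = x - mu
    _, logdetV = np.linalg.slogdet(V)
    mahal = np.einsum('ij,jk,ik->i', d, np.linalg.inv(V), d)
    # log L(theta_hat; x) for the Gaussian of Eq. (7)
    loglik = -0.5 * (n * p * np.log(2.0 * np.pi) + n * logdetV + mahal.sum())
    q = p + p * (p + 1) // 2                               # parameters in mu and V
    if criterion == "BIC":
        return -2.0 * loglik + q * np.log(n)               # Eq. (5)
    return -2.0 * loglik + 2.0 * q                         # Eq. (6)
```

Note that for n > e² ≈ 7.4 samples the BIC penalty q log n exceeds the AIC penalty 2q, so BIC favors simpler models more strongly; this asymmetry is what drives the behavioral difference discussed in Sec. 4.2.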
In Eqs. (5) and (6), the first term is the logarithmic likelihood when the model described by parameter θ̂ is applied to the empirical distribution of x, and the second term indicates the complexity of the model, called the penalty term.

3.3 Cluster mergence using an information criterion

The procedure for selective cluster mergence (procedure C in Sec. 3.1) is divided into the following processes in practice. Here, it is assumed that procedures A and B of Sec. 3.1 have already finished.

C1. Merge a pair of adjoining cluster candidates temporarily, and calculate the two values IC_single and IC_twin using either Eq. (5) or Eq. (6), where IC_single and IC_twin are the values of BIC or AIC when the distribution model applied to the unified clusters is a single distribution or a twin distribution, respectively.
C2. Calculate ΔIC, the difference between IC_single and IC_twin:

ΔIC = IC_single − IC_twin   (8)

C3. After calculating ΔIC for all pairs of adjoining cluster candidates, find the pair with the minimum ΔIC and merge the two cluster candidates of that pair conclusively. Then refresh the consecutive numbers of the cluster candidates, including the new cluster.
C4. Repeat procedures C1 to C3 until the number of clusters reaches a specified value.

IC_single < IC_twin holds when fitting a single distribution to the unified clusters is more suitable than fitting a twin distribution. Therefore, ΔIC measures the propriety of merging two adjoining clusters.

4 Clustering experiments

4.1 Experimental method

We use four kinds of data distribution as experimental datasets. Two datasets are generated artificially, each consisting of two or three clusters that differ in density or distribution shape. The other two are the Iris and BCW datasets from the UCI ML repository [6], as examples of practical data. Performance evaluation is carried out using the degree of classification error.
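Procedures C1-C4 can be sketched as follows. For brevity this sketch uses AIC with a diagonal-covariance Gaussian (the paper uses the full covariance V of Eq. (7)), and it scores the twin-distribution model as the sum of the two single-cluster criteria, which is one plausible reading of IC_twin; all names are ours.

```python
import numpy as np

def aic_diag_gaussian(x):
    """AIC of one Gaussian with diagonal covariance, ML-fitted to the rows of x."""
    n, p = x.shape
    var = x.var(axis=0) + 1e-12
    loglik = -0.5 * n * (p * np.log(2.0 * np.pi) + np.log(var).sum() + p)
    q = 2 * p                                  # one mean and one variance per dimension
    return -2.0 * loglik + 2.0 * q

def merge_candidates(clusters, k_target):
    """Procedures C1-C4: greedily merge adjoining cluster candidates.

    clusters: list of (n_i, p) arrays, ordered along the competitive layer.
    """
    clusters = list(clusters)
    while len(clusters) > k_target:            # C4: repeat until k_target clusters
        deltas = []
        for a, b in zip(clusters[:-1], clusters[1:]):          # C1: adjoining pairs
            merged = np.vstack([a, b])
            ic_single = aic_diag_gaussian(merged)              # one-distribution model
            ic_twin = aic_diag_gaussian(a) + aic_diag_gaussian(b)  # twin-distribution model
            deltas.append(ic_single - ic_twin)                 # C2: Eq. (8)
        j = int(np.argmin(deltas))                             # C3: most merge-worthy pair
        clusters[j:j + 2] = [np.vstack([clusters[j], clusters[j + 1]])]
    return clusters
```

Because nearby candidates fit a single Gaussian almost as well as two, their ΔIC is small, so the loop absorbs fragments of the same cluster before it ever merges two genuinely separate clusters.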
The classification error is calculated by comparing the cluster indices of the original dataset with those obtained from the clustering result. When applying the SOM learning algorithm in the proposed method, the number of learning iterations is set to 00 times the number of input data, and the number of cells in the competitive layer is set to ,0,,30 or 3. We make 00 trials for each setting of SOM learning and apply the cluster-mergence procedure with either BIC or AIC to each learning result, so that 00 kinds of clustering results (00 trials × 5 patterns of competitive-layer size) are obtained for each dataset.
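The paper does not detail how the obtained cluster indices are matched against the original class indices. One common convention, sketched here purely as an assumption, is to take the one-to-one relabeling of the predicted clusters that minimizes the error:

```python
import numpy as np
from itertools import permutations

def classification_error(true_labels, pred_labels):
    """Classification error (%) under the best one-to-one relabeling of the
    predicted clusters (assumes no more clusters than true classes)."""
    t = np.asarray(true_labels)
    p = np.asarray(pred_labels)
    pred_ids = np.unique(p)
    best = len(t)
    for perm in permutations(np.unique(t), len(pred_ids)):
        mapping = dict(zip(pred_ids, perm))          # candidate relabeling
        errors = int(np.sum(np.array([mapping[v] for v in p]) != t))
        best = min(best, errors)
    return 100.0 * best / len(t)
```

For example, a clustering that swaps the two class labels of a two-class dataset scores 0% error under this convention, since label identity is arbitrary in unsupervised clustering.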
Fig. 3. Artificial and practical datasets for the clustering experiments: (a) artificial dataset 1 (different densities, 3 clusters); (b) artificial dataset 2 (distorted distributions); (c) UCI Iris data (1st vs. 2nd principal component; Iris Setosa, Iris Versicolor, Iris Virginica); (d) UCI BCW data (1st vs. 2nd principal component; benign and malignant classes)
Fig. 4. Comparison of clustering performance (classification error, %) of k-means, SOM+BIC, SOM+AIC, and SOM-only: (a) artificial dataset 1; (b) artificial dataset 2; (c) UCI Iris data; (d) UCI BCW data

4.2 Experimental results

Fig. 4 shows the performance evaluation of clustering for each dataset. We calculate the average classification error over the trials for each of the five competitive-layer sizes; in the legend of each figure, Worst, Average and Best indicate the maximum, average and minimum of these average classification errors among the five settings. SOM+BIC and SOM+AIC correspond to the proposed method, while SOM-only is the conventional method that extracts clusters from the code-vector density histogram, such as shown in Fig. 2(b), with an appropriate threshold setting.

For artificial dataset 1 and the UCI BCW dataset, the SOM-only method shows very high classification error. These datasets contain clusters whose densities differ considerably from one another, and it is hard to estimate the cluster boundaries correctly using the code-vector density histogram alone. On the other hand, the proposed method extracts each cluster more accurately than the other methods, except for the k-means method on the BCW dataset. The BCW dataset contains comparatively high-dimensional data (each sample has 10 attributes), so distribution-based approaches such as SOM+BIC and SOM+AIC may not be able to estimate the parameters μ and V of the distribution model in Eq. (7) correctly.

Comparing the classification errors of SOM+BIC and SOM+AIC, both methods show almost the same clustering performance except on the UCI Iris dataset. In Eq. (5), the penalty term includes the number of samples n, and ΔIC becomes small when n is large. Hence, with the SOM+BIC method, one cluster candidate that contains a large number of samples tends to absorb its adjoining candidates one after another.
5 Conclusion

In this paper, we combined SOM-based clustering with an appropriate cluster-mergence approach using information criteria such as BIC and AIC. Since the method takes into account how natural each data distribution is as a cluster, the proposed method can extract clusters more correctly than conventional methods, especially when the dataset consists of clusters whose densities differ from one another. In clustering experiments using several kinds of artificial and practical datasets, the proposed method showed lower classification error than conventional methods such as k-means and the simple SOM-based clustering method. Furthermore, we confirmed that AIC is better suited to the proposed method than BIC. As future work, it is necessary to examine the effectiveness of the proposed method on more kinds of practical datasets.

References

1. T. Kohonen: Self-Organizing Maps, 3rd ed., Springer-Verlag, Berlin (2001)
2. M. Terashima, F. Shiratani, K. Yamamoto: Unsupervised Cluster Segmentation Method Using Data Density Histogram on Self-Organizing Feature Map, IEICE Trans., Vol. J79-D-II, No. 7 (in Japanese)
3. S. Kato, K. Koike, T. Horiuchi and Y. Itoh: A Study on Two-Stage Self-Organizing Map Suitable for Clustering Problems, Proceedings of the International Symposium on Intelligent Signal Processing and Communication Systems, pp. 77-80
4. H. Matsushita and Y. Nishio: Reunifying Self-Organizing Map and Disconnecting Self-Organizing Map, RISP Journal of Signal Processing
5. D. Pelleg and A. Moore: X-means: Extending K-means with Efficient Estimation of the Number of Clusters, Proc. of the 17th International Conference on Machine Learning, pp. 727-734 (2000)
6. UCI Machine Learning Repository, mlearn/mlrepository.html
A novel firing rule for training Kohonen selforganising maps D. T. Pham & A. B. Chan Manufacturing Engineering Centre, School of Engineering, University of Wales Cardiff, P.O. Box 688, Queen's Buildings,
More informationTowards Automatic Recognition of Fonts using Genetic Approach
Towards Automatic Recognition of Fonts using Genetic Approach M. SARFRAZ Department of Information and Computer Science King Fahd University of Petroleum and Minerals KFUPM # 1510, Dhahran 31261, Saudi
More informationImproving Classifier Performance by Imputing Missing Values using Discretization Method
Improving Classifier Performance by Imputing Missing Values using Discretization Method E. CHANDRA BLESSIE Assistant Professor, Department of Computer Science, D.J.Academy for Managerial Excellence, Coimbatore,
More informationA NEW ALGORITHM FOR OPTIMIZING THE SELF- ORGANIZING MAP
A NEW ALGORITHM FOR OPTIMIZING THE SELF- ORGANIZING MAP BEN-HDECH Adil, GHANOU Youssef, EL QADI Abderrahim Team TIM, High School of Technology, Moulay Ismail University, Meknes, Morocco E-mail: adilbenhdech@gmail.com,
More informationSYDE Winter 2011 Introduction to Pattern Recognition. Clustering
SYDE 372 - Winter 2011 Introduction to Pattern Recognition Clustering Alexander Wong Department of Systems Design Engineering University of Waterloo Outline 1 2 3 4 5 All the approaches we have learned
More informationMachine Learning Based Autonomous Network Flow Identifying Method
Machine Learning Based Autonomous Network Flow Identifying Method Hongbo Shi 1,3, Tomoki Hamagami 1,3, and Haoyuan Xu 2,3 1 Division of Physics, Electrical and Computer Engineering, Graduate School of
More informationSOMSN: An Effective Self Organizing Map for Clustering of Social Networks
SOMSN: An Effective Self Organizing Map for Clustering of Social Networks Fatemeh Ghaemmaghami Research Scholar, CSE and IT Dept. Shiraz University, Shiraz, Iran Reza Manouchehri Sarhadi Research Scholar,
More informationParameter Selection for EM Clustering Using Information Criterion and PDDP
Parameter Selection for EM Clustering Using Information Criterion and PDDP Ujjwal Das Gupta,Vinay Menon and Uday Babbar Abstract This paper presents an algorithm to automatically determine the number of
More informationPlanar Symmetry Detection by Random Sampling and Voting Process
Planar Symmetry Detection by Random Sampling and Voting Process Atsushi Imiya, Tomoki Ueno, and Iris Fermin Dept. of IIS, Chiba University, 1-33, Yayo-cho, Inage-ku, Chiba, 263-8522, Japan imiya@ics.tj.chiba-u.ac.jp
More informationA faster model selection criterion for OP-ELM and OP-KNN: Hannan-Quinn criterion
A faster model selection criterion for OP-ELM and OP-KNN: Hannan-Quinn criterion Yoan Miche 1,2 and Amaury Lendasse 1 1- Helsinki University of Technology - ICS Lab. Konemiehentie 2, 02015 TKK - Finland
More informationCOMBINED METHOD TO VISUALISE AND REDUCE DIMENSIONALITY OF THE FINANCIAL DATA SETS
COMBINED METHOD TO VISUALISE AND REDUCE DIMENSIONALITY OF THE FINANCIAL DATA SETS Toomas Kirt Supervisor: Leo Võhandu Tallinn Technical University Toomas.Kirt@mail.ee Abstract: Key words: For the visualisation
More informationCOMPARISON OF DENSITY-BASED CLUSTERING ALGORITHMS
COMPARISON OF DENSITY-BASED CLUSTERING ALGORITHMS Mariam Rehman Lahore College for Women University Lahore, Pakistan mariam.rehman321@gmail.com Syed Atif Mehdi University of Management and Technology Lahore,
More informationAn Efficient Approach for Color Pattern Matching Using Image Mining
An Efficient Approach for Color Pattern Matching Using Image Mining * Manjot Kaur Navjot Kaur Master of Technology in Computer Science & Engineering, Sri Guru Granth Sahib World University, Fatehgarh Sahib,
More informationMachine Learning (BSMC-GA 4439) Wenke Liu
Machine Learning (BSMC-GA 4439) Wenke Liu 01-31-017 Outline Background Defining proximity Clustering methods Determining number of clusters Comparing two solutions Cluster analysis as unsupervised Learning
More informationFigure (5) Kohonen Self-Organized Map
2- KOHONEN SELF-ORGANIZING MAPS (SOM) - The self-organizing neural networks assume a topological structure among the cluster units. - There are m cluster units, arranged in a one- or two-dimensional array;
More information11/14/2010 Intelligent Systems and Soft Computing 1
Lecture 8 Artificial neural networks: Unsupervised learning Introduction Hebbian learning Generalised Hebbian learning algorithm Competitive learning Self-organising computational map: Kohonen network
More informationStability Assessment of Electric Power Systems using Growing Neural Gas and Self-Organizing Maps
Stability Assessment of Electric Power Systems using Growing Gas and Self-Organizing Maps Christian Rehtanz, Carsten Leder University of Dortmund, 44221 Dortmund, Germany Abstract. Liberalized competitive
More informationPATTERN RECOGNITION USING NEURAL NETWORKS
PATTERN RECOGNITION USING NEURAL NETWORKS Santaji Ghorpade 1, Jayshree Ghorpade 2 and Shamla Mantri 3 1 Department of Information Technology Engineering, Pune University, India santaji_11jan@yahoo.co.in,
More informationAn Efficient Model Selection for Gaussian Mixture Model in a Bayesian Framework
IEEE SIGNAL PROCESSING LETTERS, VOL. XX, NO. XX, XXX 23 An Efficient Model Selection for Gaussian Mixture Model in a Bayesian Framework Ji Won Yoon arxiv:37.99v [cs.lg] 3 Jul 23 Abstract In order to cluster
More informationA Simple Automated Void Defect Detection for Poor Contrast X-ray Images of BGA
Proceedings of the 3rd International Conference on Industrial Application Engineering 2015 A Simple Automated Void Defect Detection for Poor Contrast X-ray Images of BGA Somchai Nuanprasert a,*, Sueki
More informationModified Self-Organizing Mixture Network for Probability Density Estimation and Classification
Proceedings of International Joint Conference on Neural Networks, Dallas, Texas, USA, August 4-9, 2013 Modified Self-Organizing Mixture Network for Probability Density Estimation and Classification Lin
More informationAn Efficient Method for Extracting Fuzzy Classification Rules from High Dimensional Data
Published in J. Advanced Computational Intelligence, Vol., No., 997 An Efficient Method for Extracting Fuzzy Classification Rules from High Dimensional Data Stephen L. Chiu Rockwell Science Center 049
More informationComputational Statistics The basics of maximum likelihood estimation, Bayesian estimation, object recognitions
Computational Statistics The basics of maximum likelihood estimation, Bayesian estimation, object recognitions Thomas Giraud Simon Chabot October 12, 2013 Contents 1 Discriminant analysis 3 1.1 Main idea................................
More informationExtract an Essential Skeleton of a Character as a Graph from a Character Image
Extract an Essential Skeleton of a Character as a Graph from a Character Image Kazuhisa Fujita University of Electro-Communications 1-5-1 Chofugaoka, Chofu, Tokyo, 182-8585 Japan k-z@nerve.pc.uec.ac.jp
More informationThe Analysis of Traffic of IP Packets using CGH. Self Organizing Map
2015 International Conference on Computational Science and Computational Intelligence The Analysis of Traffic of IP Packets using CGH Self Organizing Maps Hiroshi Dozono Department of Advanced Fusion Saga
More informationA NON-ADAPTIVE DISTRIBUTED SYSTEM-LEVEL DIAGNOSIS METHOD FOR COMPUTER NETWORKS
A NON-ADAPIVE DISRIBUED SYSEM-LEVEL DIAGNOSIS MEHOD FOR COMPUER NEWORKS Hiroshi MASUYAMA and sutomu MIYOSHI Information and Knowledge Engineering, ottori University Koyama-cho Minami -, ottori, 0- Japan
More informationAn Approach for Fuzzy Modeling based on Self-Organizing Feature Maps Neural Network
Appl. Math. Inf. Sci. 8, No. 3, 27-2 (24) 27 Applied Mathematics & Information Sciences An International Journal http://dx.doi.org/.278/amis/8334 An Approach for Fuzzy Modeling based on Self-Organizing
More informationECG782: Multidimensional Digital Signal Processing
ECG782: Multidimensional Digital Signal Processing Object Recognition http://www.ee.unlv.edu/~b1morris/ecg782/ 2 Outline Knowledge Representation Statistical Pattern Recognition Neural Networks Boosting
More informationBioimage Informatics
Bioimage Informatics Lecture 14, Spring 2012 Bioimage Data Analysis (IV) Image Segmentation (part 3) Lecture 14 March 07, 2012 1 Outline Review: intensity thresholding based image segmentation Morphological
More informationHardware Realization of Panoramic Camera with Direction of Speaker Estimation and a Panoramic Image Generation Function
Proceedings of the th WSEAS International Conference on Simulation, Modelling and Optimization, Beijing, China, September -, 00 Hardware Realization of Panoramic Camera with Direction of Speaker Estimation
More informationMineral Exploation Using Neural Netowrks
ABSTRACT I S S N 2277-3061 Mineral Exploation Using Neural Netowrks Aysar A. Abdulrahman University of Sulaimani, Computer Science, Kurdistan Region of Iraq aysser.abdulrahman@univsul.edu.iq Establishing
More information