Self-Organizing Maps (SOM)


Self-Organizing Maps (SOM)
Turgay İBRİKÇİ, PhD

Outline: Introduction, Structures of SOM, SOM Architecture, Neighborhoods, SOM Algorithm, Examples, Summary.

Unsupervised Hebbian Learning

A linear unit: V = Σ_j W_j X_j. The learning rule is Hebbian-like:

ΔW_j = η V (X_j − V W_j)

The change in a weight depends on the product of the neuron's output and its input, with a term that makes the weights decrease (a code sketch of this rule follows below). Such a net converges onto the weight vector that maximizes the average of V². This means the weight vector points along the first principal component of the data: the network learns a feature of the data without any prior knowledge. This is called feature extraction.

Unsupervised Competitive Learning

In Hebbian networks, all neurons can fire at the same time. Competitive learning means that only a single neuron from each group fires at each time step. Output units compete with one another; these are winner-take-all (WTA) units ("grandmother cells").

Simple Competitive Learning Algorithm

- Initialize the weights to samples from the input.
- Leaky learning: also update the weights of the losers (but with a smaller η).
- Arrange the neurons in a geometrical way: update the neighbors as well.
- Turn on input patterns gradually.
- Conscience mechanism.
- Add noise to the input patterns.
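As a concreteness check, here is a minimal sketch of the Hebbian-like rule above (Oja's rule) in Python with NumPy. The data matrix, learning rate, and epoch count are illustrative assumptions, not from the slides; on zero-mean data the weight vector should end up aligned with the first principal component, up to sign.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative zero-mean 2-D data, elongated along the direction (2, 1).
X = rng.normal(size=(1000, 2)) * [2.0, 0.5]
X = X @ np.array([[2, 1], [-1, 2]]) / np.sqrt(5)  # rotate the cloud

w = rng.normal(size=2)              # small random initial weights
eta = 0.01                          # learning rate (assumed value)

for _ in range(5):                  # a few passes over the data
    for x in X:
        v = w @ x                   # linear unit: V = sum_j W_j X_j
        w += eta * v * (x - v * w)  # Oja's rule: dW_j = eta*V*(X_j - V*W_j)

# w should now point along the first principal component (up to sign).
pc1 = np.linalg.svd(X, full_matrices=False)[2][0]
print(w / np.linalg.norm(w), pc1)
```

The second term, −η V² W_j, is the decay the slide mentions: it keeps the weight norm bounded, so the rule converges instead of growing without limit as plain Hebbian learning would.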

Simple Competitive Learning

Network activation: N input units, P output neurons, and a P × N weight matrix. Each output unit computes its field

h_i = Σ_{j=1..N} W_ij X_j,  i = 1, 2, ..., P

(The slide shows the network diagram: inputs x_1 ... x_N, weights W_11 ... W_PN, outputs Y_1 ... Y_P, with Y_i = 1 or 0.) The unit with the highest field h_i fires: i* is the winner unit. Geometrically, W_{i*} is the weight vector closest to the current input vector, and the winning unit's weights are updated to be even closer to the current input.

Learning

Starting with small random weights, at each step:
1. a new input vector is presented to the network;
2. all fields are calculated to find the winner;
3. W_{i*} is updated to be closer to the input.

Introduction of SOM

The SOM is a neural network developed in the 1980s by Teuvo Kohonen in Finland. It is an unsupervised learning method. A SOM looks like a grid (simple to visualize), and SOMs are used for clustering.

Some Historical Notes

- Local ordering (von der Malsburg, 1973).
- (Amari, 1980): a mathematical analysis elucidates the dynamic stability of a cortical map.
- Self-organizing feature map (SOM) (Kohonen, 1982).
- (Erwin, 1992): a convex neighbourhood function should be used (e.g. a Gaussian).
- The relationship between the SOM and principal curves is discussed in (Ritter, 1992) and (Cherkassky and Mulier, 1995).

Principles of Self-Organization

1. Modifications in synaptic weights tend to self-amplify.
2. Limitation of resources leads to competition among synapses.
3. Modifications in synaptic weights tend to cooperate.
4. Order and structure in activation patterns represent redundant information that is transformed into knowledge by the network.
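A minimal sketch of one winner-take-all step, following the activation and learning steps just described; the function name, array shapes, and learning rate are assumptions for illustration.

```python
import numpy as np

def wta_step(W, x, eta=0.1):
    """One winner-take-all (WTA) competitive learning step.

    W: (P, N) weight matrix; x: (N,) input vector; eta: learning rate.
    """
    h = W @ x                           # fields h_i = sum_j W_ij * X_j
    winner = np.argmax(h)               # the unit with the highest field fires
    W[winner] += eta * (x - W[winner])  # winner moves closer to the input
    return winner
```

Note that picking the largest field h_i coincides with picking the smallest Euclidean distance ||x − w_i|| only when the weight vectors all have the same norm; the slide's geometric reading assumes this.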

Determine the winner: the neuron whose weight vector has the smallest distance to the input vector. Move the weight w of the winning neuron towards the input. (The slide illustrates w before and after learning.)

Impose a topological order onto the competitive neurons (e.g., a rectangular map) and let the neighbors of the winner share the prize ("the postcode lottery principle"). After learning, neurons with similar weights tend to cluster on the map.

Concept of the SOM

The SOM maps the input space onto a reduced feature space: an input layer feeds a map layer. Cluster centers (codebook vectors) are placed in the reduced space; the SOM performs clustering and ordering of the cluster centers on a two-dimensional grid. (The slides illustrate this with input-space and map-layer diagrams.)

Structures of SOM

The map grid can be rectangular or hexagonal. (The slide sketches both lattice types.)

SOM Architecture

Two layers of units:
- Input: n units (the length of the training vectors).
- Output: m units (the number of categories).

Input units are fully connected, with weights, to the output units. Within the output layer there are intralayer (lateral) connections, defined according to some topology; no weights are attached to those connections. The adaptive process changes the weights to match the n-dimensional inputs more closely. (The slide diagram highlights the winning node and its neighborhood.)

Neighborhoods

Based on the lattice structure, each neuron has a neighborhood of adjacent neurons:
- a linear map has 2 nearest neighbors,
- a rectangular map has 8 nearest neighbors,
- a hexagonal map has 6 nearest neighbors.

When a neuron wins a competition, the weights of the neurons in its neighborhood are adjusted as well, so the learning rule is applied locally.

SOM Algorithm

- Initialize weights: randomly set the weights for each node in the map.
- Competitive process: find the winning node.
- Cooperative process: determine the neighborhood of the winning neuron at this time step.
- Weight adjustment process: the weights of the winning neuron receive the strongest update, while the updates to neighboring neurons are damped (they fall off with distance, as described next).

Competitive process (see the code sketch below)

Given an input vector x = [x_1, x_2, x_3, ..., x_n] and weight vectors w_i = [w_i1, w_i2, w_i3, ..., w_in], compute the distances d_i = ||x − w_i||, i = 1, 2, 3, .... The winning node is the one with the minimum ||x − w_i||.
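A minimal sketch of the competitive and cooperative processes, assuming the map is stored as a (rows, cols, n) weight array on a rectangular lattice; the function names and the radius parameter are illustrative.

```python
import numpy as np

def find_winner(weights, x):
    """Competitive process: grid index of the node whose weight vector
    is closest (Euclidean distance) to the input x.

    weights: (rows, cols, n) array; x: (n,) input vector.
    """
    d = np.linalg.norm(weights - x, axis=2)    # d_i = ||x - w_i||
    return np.unravel_index(np.argmin(d), d.shape)

def neighborhood(winner, shape, radius=1):
    """Cooperative process: grid coordinates within `radius` (Chebyshev
    distance) of the winner on a rectangular lattice. With radius=1 this
    yields the winner plus its 8 nearest neighbors, as on the slide."""
    r0, c0 = winner
    rows, cols = shape
    return [(r, c)
            for r in range(max(0, r0 - radius), min(rows, r0 + radius + 1))
            for c in range(max(0, c0 - radius), min(cols, c0 + radius + 1))]
```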

Cooperative process

The topological neighborhood is centered on the winning neuron and falls off as the distance to a neighboring neuron increases.

Weight adjustment process

Weight adjustment formula: w_j(new) = w_j(current) + Δw_j, that is,

w_j(n+1) = w_j(n) + η(n) [x − w_j(n)]

SOM Learning Algorithm Pseudocode

1. Randomly initialise all weights and the topology (1-D or 2-D).
2. Select an input vector x = [x_1, x_2, x_3, ..., x_n] from the training set.
3. Compare x with the weights w_j of each neuron j to compute d_j = Σ_i (w_ji − x_i)².
4. Determine the winner: find the unit j with the minimum distance.
5. Update the winner so that it becomes more like x, together with the winner's neighbours (units within the radius), according to w_j(n+1) = w_j(n) + η(n) [x − w_j(n)].
6. Adjust the parameters: learning rate & neighbourhood function.
7. Repeat from (2) until convergence.

Note that the learning rate generally decreases with time: 0 < η(n) ≤ η(n−1) ≤ 1.

Example

Form a self-organizing map (SOM) to cluster a set of four input vectors; we will look for 2 clusters in this set:

x1 = (1, 0, 0, 1)
x2 = (0, 1, 0, 1)
x3 = (1, 0, 0, 0)
x4 = (0, 0, 1, 1)

Trace (first epoch, one step per input; d_1 and d_2 are the squared distances to w_1 and w_2, as in step 3 of the pseudocode, and the numbers are consistent with a learning rate of η = 0.6; a code sketch reproducing this epoch follows the analysis below). Initial weights: w_1 = (0.9355, 0.9169, 0.4103, 0.8936), w_2 = (0.0579, 0.3529, 0.8132, 0.0099).

- Input x1: d_1 = 1.0245, d_2 = 2.6537, so unit 1 wins; w_1 = (0.9742, 0.3668, 0.1641, 0.9575), w_2 unchanged.
- Input x2: d_1 = 1.3788, d_2 = 2.0637, so unit 1 wins; w_1 = (0.3897, 0.7467, 0.0656, 0.9830), w_2 unchanged.
- Input x3: d_1 = 1.9006, d_2 = 1.6734, so unit 2 wins; w_1 unchanged, w_2 = (0.6232, 0.1411, 0.3253, 0.0039).
- Input x4: d_1 = 1.5827, d_2 = 1.8556, so unit 1 wins; w_1 = (0.1559, 0.2987, 0.6263, 0.9932), w_2 unchanged.

Trace of Weights

Epoch        w_1                                 w_2
Initial      (0.9355, 0.9169, 0.4103, 0.8936)    (0.0579, 0.3529, 0.8132, 0.0099)
Epoch 1      (0.1559, 0.2987, 0.6263, 0.9932)    (0.6232, 0.1411, 0.3253, 0.0039)
Epoch 2      (0.1083, 0.2612, 0.6359, 0.9995)    (0.8470, 0.0573, 0.1321, 0.0016)
Epoch 5      (0.0006, 0.2968, 0.7026, 1.0000)    (0.9992, 0.0003, 0.0007, 0.2954)
Epoch 20     (0.0000, 0.3345, 0.6655, 1.0000)    (1.0000, 0.0000, 0.0000, 0.3345)
Epoch 50     (0.0000, 0.3868, 0.6132, 1.0000)    (1.0000, 0.0000, 0.0000, 0.3868)
Epoch 100    (0.0000, 0.4365, 0.5635, 1.0000)    (1.0000, 0.0000, 0.0000, 0.4365)
Epoch 1000   (0.0000, 0.4987, 0.5013, 1.0000)    (1.0000, 0.0000, 0.0000, 0.4987)

Final clusters

Final weights: w_1 = (0.0000, 0.4987, 0.5013, 1.0000), w_2 = (1.0000, 0.0000, 0.0000, 0.4987).

Final cluster assignment of the inputs:

Input                d_1      d_2      Cluster
x1 = (1, 0, 0, 1)    1.5000   0.2513   2
x2 = (0, 1, 0, 1)    0.5025   2.2513   1
x3 = (1, 0, 0, 0)    2.5000   0.2487   2
x4 = (0, 0, 1, 1)    0.4975   2.2513   1

Analysis

The cluster assignment of (x1, x2, x3, x4) is (2, 1, 2, 1):
- Cluster 1 contains x2 = (0, 1, 0, 1) and x4 = (0, 0, 1, 1).
- Cluster 2 contains x1 = (1, 0, 0, 1) and x3 = (1, 0, 0, 0).
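The whole worked example fits in a few lines of NumPy. The sketch below assumes what the trace implies: squared Euclidean distances (step 3 of the pseudocode), a neighborhood radius of zero (only the winner is updated), and η = 0.6 held constant within the epoch. It reproduces the first-epoch trace above, up to rounding in the last digit; the slides' longer run also decays η over time, which this sketch omits.

```python
import numpy as np

# The four input vectors of the example and the initial weight
# vectors from the weight trace.
X = np.array([[1, 0, 0, 1],
              [0, 1, 0, 1],
              [1, 0, 0, 0],
              [0, 0, 1, 1]], dtype=float)
W = np.array([[0.9355, 0.9169, 0.4103, 0.8936],   # w_1
              [0.0579, 0.3529, 0.8132, 0.0099]])  # w_2

eta = 0.6                                  # learning rate implied by the trace
for x in X:                                # one epoch: one step per input
    d = np.sum((W - x) ** 2, axis=1)       # squared distances d_1, d_2
    winner = np.argmin(d)                  # competitive process
    W[winner] += eta * (x - W[winner])     # weight adjustment (radius 0)
    print(f"d = {d.round(4)}, winner = unit {winner + 1}")

print(W.round(4))   # matches the Epoch 1 row of the weight trace
```

Running it prints the four (d_1, d_2) pairs and winners of the trace and leaves W at the Epoch 1 values.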

Example: clustering the colors

(The slides at this point are figures only: a worked example of a SOM clustering colors.)
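The color demo is a natural way to see a SOM at work, since each codebook vector is itself an RGB triple and the trained map can be displayed directly as an image. A minimal sketch, assuming random RGB training data, a rectangular lattice, a Gaussian neighborhood (as suggested in the historical notes above), and an exponentially decaying learning rate and radius; all parameter values are illustrative, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(1)

rows, cols, dim = 20, 20, 3           # 20x20 map of RGB codebook vectors
W = rng.random((rows, cols, dim))     # random initial weights
X = rng.random((2000, dim))           # training set: random RGB colors

# Grid coordinates, used for lattice distances to the winner.
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                            indexing="ij"), axis=-1)

eta0, sigma0, n_steps = 0.5, max(rows, cols) / 2, 10_000
for n in range(n_steps):
    x = X[rng.integers(len(X))]
    # Competitive process: winner = closest codebook vector.
    d = np.linalg.norm(W - x, axis=2)
    winner = np.unravel_index(np.argmin(d), d.shape)
    # Cooperative process: Gaussian neighborhood on the lattice, centered
    # on the winner and falling off with grid distance; both the learning
    # rate and the radius shrink over time (illustrative schedule).
    decay = np.exp(-3.0 * n / n_steps)
    eta, sigma = eta0 * decay, sigma0 * decay
    lattice_d2 = np.sum((grid - np.array(winner)) ** 2, axis=-1)
    h = np.exp(-lattice_d2 / (2 * sigma ** 2))
    # Weight adjustment: every unit moves toward x, damped by h.
    W += eta * h[..., None] * (x - W)

# W now holds a smooth color map; e.g. plt.imshow(W) would display it.
```

After training, neighboring cells hold similar colors: the topology preservation noted in the summary below.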

Examples of Applications

- Kohonen (1984): speech recognition, a map of the phonemes of the Finnish language.
- Optical character recognition: clustering of letters of different fonts.
- Angéniol et al. (1988): the travelling salesman problem (an optimization problem).
- Kohonen (1990): learning vector quantization (a pattern classification problem).
- Ritter & Kohonen (1989): semantic maps.

Summary I

- Unsupervised learning is very common.
- Unsupervised learning requires redundancy in the stimuli.
- Self-organization is a basic property of the brain's computational structure.
- The SOM uses an unsupervised clustering process.
- Properties of the final output map correspond to statistically related input data.

Summary II

- SOMs are based on competition (winner-take-all, WTA, units), cooperation, and synaptic adaptation.
- SOMs conserve topological relationships between the stimuli.
- Artificial SOMs have many applications in computational neuroscience.

Questions & Answers