Neural networks for variable star classification

Neural networks for variable star classification
Vasily Belokurov, IoA, Cambridge

Supervised classification: Multi-Layer Perceptron (MLP). Reference: "Neural Networks for Pattern Recognition" by C. Bishop.
Unsupervised learning and detection: Self-Organizing Map (SOM). Reference: "Self-Organizing Maps" by T. Kohonen.
Bayesian methods for learning algorithms. Reference: "Information Theory, Inference and Learning Algorithms" by D. MacKay.

Multi-Layered Perceptrons for supervised classification
Training set: a library of classified lightcurves (datasets from Hipparcos, OGLE, etc.); archive integration.
Feature extraction: lightcurves need to be parameterized, which requires tools for spectral analysis of sparsely sampled data.
P(C|x): the probability that the variability class is C given the measured features x.
Finding P(C|x) means approximating a nonlinear function in many dimensions; neural networks can approximate this mapping, as sketched below.
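
To make the probabilistic reading concrete, here is a minimal Python/numpy sketch, assuming a one-hidden-layer MLP with tanh hidden units and a softmax output whose weights (W1, b1, W2, b2 are hypothetical names) have already been trained. The softmax outputs are non-negative and sum to one, so they can be read as estimates of P(C|x):

    import numpy as np

    def posterior(x, W1, b1, W2, b2):
        """Estimate P(C|x) for each class C with a trained one-hidden-layer MLP."""
        h = np.tanh(W1 @ x + b1)        # hidden-layer activations
        z = W2 @ h + b2                 # output-layer pre-activations
        e = np.exp(z - z.max())         # softmax, shifted for numerical stability
        return e / e.sum()              # non-negative, sums to 1: read as P(C|x)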

Multi-Layered Perceptrons for supervised classification
Feed-forward networks:
- Each connection has a weight assigned.
- The network can have several layers of hidden (processing) units (neurons).
- Each neuron receives a weighted sum of values from the lower layer.
- Each neuron then passes this weighted sum through a nonlinear activation function and sends the result on.
- The output is calculated and can be compared to the target.
Error back-propagation:
- An error function quantifies the difference between output and target.
- Derivatives of the error function with respect to the weights are calculated by propagating the 'errors' back from the higher layers, as in the sketch below.
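
As a hedged illustration (not the talk's exact recipe), the sketch below takes one gradient-descent step for the same tanh/softmax architecture with a cross-entropy error function. The output-layer 'errors' delta2 are propagated down through the weights W2 to give the hidden-layer errors delta1:

    import numpy as np

    def backprop_step(x, target, W1, b1, W2, b2, lr=0.01):
        """One gradient step; target is a one-hot class vector (assumption)."""
        # Forward pass
        h = np.tanh(W1 @ x + b1)
        z = W2 @ h + b2
        p = np.exp(z - z.max())
        p /= p.sum()
        # Output-layer 'errors': gradient of the cross-entropy error w.r.t. z
        delta2 = p - target
        # Propagate the errors from the higher layer through W2 and tanh'
        delta1 = (W2.T @ delta2) * (1.0 - h**2)
        # Gradient-descent updates of the connection weights
        W2 -= lr * np.outer(delta2, h)
        b2 -= lr * delta2
        W1 -= lr * np.outer(delta1, x)
        b1 -= lr * delta1
        return W1, b1, W2, b2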

Multi-Layered Perceptrons for supervised classification
Software: Stuttgart Neural Network Simulator (SNNS).

Multi-Layered Perceptrons for supervised classification
Advantages:
- Distributed processing of all available information.
- Approximates complicated decision boundaries in many dimensions.
- Novelty detection.
Example: MACHO LMC data, with 5 inputs extracted from each lightcurve; identifies 1 microlensing event in a tile of 2500 lightcurves (Belokurov, Evans & Le Du 2003).

Desirables:
- Visualize data in more than 3 dimensions.
- Preserve the topology.
SOM:
- The weights of each map node define a reference vector of the same dimensionality as the input.
- The node with the smallest Euclidean distance between its reference vector and the input defines the response of the map to that input (see the sketch below).
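
A minimal sketch of the map response in Python/numpy, assuming the reference vectors are stored in a codebook array of shape (rows, cols, d), where d is the input dimensionality:

    import numpy as np

    def best_matching_unit(codebook, x):
        """Grid index (row, col) of the node whose reference vector is closest to x."""
        d2 = np.sum((codebook - x)**2, axis=-1)   # squared Euclidean distances
        return np.unravel_index(np.argmin(d2), d2.shape)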

Learning:
1. Find the winner.
2. Update the winner to minimize the distance.
3. Allow nodes in the vicinity of the winner to be updated as well (defined by the neighborhood kernel).
4. The kernel monotonically decreases with time.
Quality of learning: quantization error; distortion measure.
Calibration: set reference points with manually analyzed data.
A sketch of this loop follows.
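
Here is a hedged sketch of the learning loop, reusing best_matching_unit from the sketch above. The Gaussian neighborhood kernel and the linear decay of the learning rate and kernel width are common choices assumed for illustration (SOM_PAK's exact schedules may differ); a quantization-error helper measures the quality of learning:

    import numpy as np

    def train_som(codebook, data, n_steps=10000, lr0=0.5, sigma0=3.0, seed=0):
        rows, cols, _ = codebook.shape
        rng = np.random.default_rng(seed)
        # (row, col) coordinates of every node, shape (rows, cols, 2)
        grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                    indexing='ij'), axis=-1)
        for t in range(n_steps):
            frac = t / n_steps
            lr = lr0 * (1.0 - frac)                  # 4. rate decays with time
            sigma = sigma0 * (1.0 - frac) + 0.5      # 4. kernel shrinks with time
            x = data[rng.integers(len(data))]
            win = best_matching_unit(codebook, x)    # 1. find the winner
            d2 = np.sum((grid - np.array(win))**2, axis=-1)
            kernel = np.exp(-d2 / (2.0 * sigma**2))  # 3. neighborhood kernel
            # 2.-3. pull the winner and its neighbors toward the input
            codebook += lr * kernel[..., None] * (x - codebook)
        return codebook

    def quantization_error(codebook, data):
        """Mean distance from each sample to its best-matching reference vector."""
        flat = codebook.reshape(-1, codebook.shape[-1])
        d = np.linalg.norm(data[:, None, :] - flat[None, :, :], axis=-1)
        return d.min(axis=1).mean()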

Example: Hipparcos variables
52 input features in total:
- Lomb periodogram, integrated in 40 bins and normalized.
- The 5, 15, 25, 35, 45, 55, 65, 75, 85, 95 magnitude percentiles, with the median magnitude subtracted.
- Ratio of magnitude above/below the median.
- V-I color.
Color evolution?...
Dataset: more than 2500 'solved' Hipparcos variables.
Software: SOM_PAK (free).
A sketch of one possible implementation of this parameterization follows.
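
The sketch below is one hedged reading of this 52-feature recipe (40 + 10 + 1 + 1 = 52), not the original code: astropy's LombScargle stands in for the talk's spectral-analysis tools, the equal-width binning, unit-sum normalization, and the mean-excursion definition of the above/below ratio are all assumptions, and the V-I color is assumed to be supplied from the catalogue:

    import numpy as np
    from astropy.timeseries import LombScargle   # handles sparsely sampled data

    def extract_features(t, mag, v_minus_i, n_bins=40):
        """Map one irregularly sampled lightcurve to a 52-feature vector:
        40 periodogram bins + 10 percentiles + above/below ratio + V-I."""
        # Lomb periodogram, integrated in 40 bins and normalized
        freq, power = LombScargle(t, mag).autopower()
        pgram = np.array([chunk.sum() for chunk in np.array_split(power, n_bins)])
        pgram /= pgram.sum()                      # unit-sum normalization (assumption)
        # Ten magnitude percentiles with the median magnitude subtracted
        med = np.median(mag)
        pct = np.percentile(mag, [5, 15, 25, 35, 45, 55, 65, 75, 85, 95]) - med
        # Mean excursion above vs. below the median: one reading of
        # 'ratio of magnitude above/below median'
        ratio = np.mean(mag[mag > med] - med) / np.mean(med - mag[mag < med])
        return np.concatenate([pgram, pct, [ratio, v_minus_i]])   # 40+10+1+1 = 52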

50,000 variables in the POINT-AGAPE dataset

Strategy 1 Train the map with the whole dataset available Calibrate with known (archived) variables Strategy 2 Train the map only with known variables For each new lightcurve find the location on the map Science Alerts Novelty detection - the black spots and walls - regions with low reference vector density Numbers on the x and y-axis do NOT carry any (usual) meaning. However, SOMs can be used for feature extraction and then the (x,y) positions on the map can be analysed with MLPs.