WHAT TYPE OF NEURAL NETWORK IS IDEAL FOR PREDICTIONS OF SOLAR FLARES?

Initially, a feed-forward neural network was considered for this model; in such a network, connections between units do not form a directed cycle. Training it with back-propagation is the most useful approach in our case. With back-propagation, the input data is repeatedly presented to the neural network. With each presentation, the output of the network is compared to the desired output and an error is computed. This error is then fed back (back-propagated) through the network and used to adjust the weights so that the error decreases with each iteration and the model gets closer and closer to producing the desired output. This process is known as training. It requires a data set pairing many inputs with their desired outputs, making up a training set. Over this training set, a learning rule updates the weights of the artificial neurons.
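To make the training cycle concrete, here is a minimal sketch of the present-compare-adjust loop described above, for a network with a single hidden layer. It is written in Python with numpy (the actual work used IDL), and the data, layer sizes, and learning rate are illustrative assumptions, not the real GOES inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical stand-ins for the training set: 4 inputs per example,
# one desired output, all normalized to [0, 1].
X = rng.random((100, 4))
y = rng.random((100, 1))

# Weights are seeded randomly before training begins.
W1 = rng.normal(scale=0.5, size=(4, 8))
W2 = rng.normal(scale=0.5, size=(8, 1))
lr = 0.1  # learning rate (see next slide)

for epoch in range(1000):
    # Present the inputs and compute the network's output.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)

    # Compare to the desired output: this difference is the error.
    err = out - y

    # Feed the error back (back-propagate) and adjust the weights
    # so the error shrinks on the next presentation.
    d_out = err * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    W2 -= lr * (h.T @ d_out)
    W1 -= lr * (X.T @ d_h)
```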

ADDITIONAL ARCHITECTURAL WORK

The learning rate is one of the parameters that governs how fast a neural network learns and how effective the training is. Essentially, it is the size of the steps the algorithm takes when iteratively minimizing the error function. The weights scale the signals carried along the connections between neurons; adjusting them changes how the network transforms its inputs into outputs.
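In update-rule form, the step that the learning rate controls is simply a scaled move against the error gradient; a one-function sketch (the names are hypothetical):

```python
def update_weight(w, grad, lr=0.1):
    # One gradient-descent step: move the weight against the error
    # gradient. A large lr takes big steps and may overshoot the
    # minimum; a small lr is stable but slow to converge.
    return w - lr * grad
```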

ACTUAL NETWORK

This was an actual neural network trained with data from GOES, for flares greater than M5. Supervised learning was used, which means that each training example consists of an input object and a desired output value. The learning algorithm analyzes the data and produces an inferred function, called a classifier. The inputs were F10.7 radio flux, X-forecast, M-forecast, and soft X-ray background (in that order), from May 5th, 1996 to January 1st, 2008. The data values for the inputs were normalized within IDL because the network only accepted values between 0 and 1. This was done by finding the minimum and maximum values of each array and rescaling so the minimum maps to 0 and the maximum to 1.
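The rescaling described here is plain min-max normalization. A sketch in Python/numpy, standing in for the original IDL routine (the flux values below are made up):

```python
import numpy as np

def normalize(arr):
    # Rescale so the array's minimum maps to 0 and its maximum to 1,
    # matching the 0-to-1 range the network accepts.
    lo, hi = arr.min(), arr.max()
    return (arr - lo) / (hi - lo)

f107 = np.array([65.0, 80.2, 120.5, 240.1])  # hypothetical F10.7 values
print(normalize(f107))  # -> approximately [0, 0.087, 0.317, 1.0]
```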

ACTUAL NETWORK PART 2

The simplest way to describe this system: the inputs are fed into the input layer and multiplied by interconnection weights as they pass from the input layer to the first hidden layer. Within the first hidden layer, they are summed and then processed by a nonlinear function (a sigmoid function in our case, though it could be a hyperbolic tangent). Sigmoid functions are ideal for keeping signals within a specified range. As the processed data leaves the first hidden layer, it is again multiplied by interconnection weights, then summed and processed by the second hidden layer. Finally, the data is multiplied by interconnection weights and processed one last time within the output layer to produce the neural network output.
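This multiply-sum-squash pipeline through both hidden layers can be written compactly; a sketch with illustrative layer sizes (4 inputs, two hidden layers of 8 units, 1 output) that are assumptions, not the actual architecture:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, W2, W3):
    h1 = sigmoid(x @ W1)     # weight, sum, squash: first hidden layer
    h2 = sigmoid(h1 @ W2)    # weight, sum, squash: second hidden layer
    return sigmoid(h2 @ W3)  # output layer produces the network output

rng = np.random.default_rng(1)
x = rng.random(4)  # four normalized inputs
W1, W2, W3 = (rng.normal(size=s) for s in [(4, 8), (8, 8), (8, 1)])
print(forward(x, W1, W2, W3))
```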

ACTUAL NETWORK PART 3

There were roughly 5,647 data points. The addition of a second hidden layer allows more connections; this improved the correlation of input combinations and made the network easier to train. The output value is the trained value, the number we are training toward. 0.008 was the maximum error chosen, meaning the permissible error for the trained values is 0.8%; note, however, that after normalization almost all of the flare values we were training on were near zero. Improvements will be considered later on.
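One way to read the 0.008 tolerance is as a stopping criterion on the normalized outputs; the sketch below is an assumption about how it was applied, not a transcription of the original code:

```python
import numpy as np

TOLERANCE = 0.008  # maximum permissible error on the normalized scale

def converged(predictions, targets, tol=TOLERANCE):
    # Training is done once no trained value misses its target by
    # more than the tolerance.
    return float(np.max(np.abs(predictions - targets))) < tol
```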

TEST

A small snippet of the data shows how immense the calculations become when actually running the network through a simulation. The connection weights are also modified every time the network does not show the desired behavior.

LIMITATIONS?

In reference to back-propagation networks, there are some specific issues to be aware of. These neural networks are sometimes seen as black boxes: aside from defining the general architecture of the network and initially seeding it with random weights, the user has no role other than to feed it input, watch it train, and await the output. Many online references state, in relation to back-propagation, that "you almost don't know what you're doing; the learning proceeds on its own." The final product of this process is a trained network that provides no equations, coefficients, or plots defining a relationship beyond its own internal mathematics. The network IS the final equation in the system.

WAIT, MORE COMPLICATIONS??

Back-propagation networks also tend to be slower than other types of networks and can often require thousands of training steps. If run on a truly parallel computing system (i.e., a supercomputer that can perform many calculations simultaneously), running the network is not really a problem. However, on standard serial machines (Macs and PCs), training may take quite a while, because the machine's CPU must compute each function of each node and connection separately; this can be problematic in massive networks with an abundance of data. It is a real concern if we want to feed the network new data every minute or so to predict flares.

HOPE IS NOT LOST

Artificial neural networks can provide an analytical alternative to standard techniques, which are usually inhibited by strict assumptions of normality, linearity, and variable independence. Because ANNs can capture many kinds of relationships, they allow the user to model, relatively easily, phenomena that might otherwise have been very difficult or impossible to explain. They can be used for character recognition, breast cancer categorization, stock market prediction, etc.

CONCLUSIONS

The artificial neural network constructed here is a starting step toward a much larger project. Training methods may need to be further optimized to more accurately filter out data that may not be useful to the network. We could also utilize image processing to characterize active regions, which can reveal concentrations of magnetic activity; as a result, the network might be able to perform local forecasting of the likelihood of an individual region, or group of regions, producing large flares. A larger training set may also help. Other factors, such as temperature, spectral range, and magnetograms, could be added to the parameters that characterize a solar flare. These may improve the accuracy and efficiency of flare prediction.

SOURCES

http://www.swpc.noaa.gov/swn/index.html
http://hesperia.gsfc.nasa.gov/sftheory/flare.htm
http://en.wikipedia.org/wiki/solar_flare
http://lasp.colorado.edu/home/eve/science/instrument/
http://en.wikipedia.org/wiki/artificial_neural_network
http://en.wikipedia.org/wiki/solar_cycle
http://www.astropa.unipa.it/~orlando/intro_sun/slide17.html
http://pages.cs.wisc.edu/~bolo/shipyard/neural/local.html
http://www.ibm.com/developerworks/library/l-neural/
Neuroph.sourceforge.net
http://www.nd.com/neurosolutions/products/ns/whatisnn.html