Neural Networks in Statistica


Neural Networks in Statistica
Agnieszka Nowak-Brzezińska
http://usnet.us.edu.pl/uslugi-sieciowe/oprogramowanie-w-usk-usnet/oprogramowaniestatystyczne/

The basic element of every neural network is the neuron. In the biological analogy, inputs x1, x2, ..., xn arrive along the dendrites, each input is multiplied by a weight w1, w2, ..., wn, the weighted inputs are summed (S), and the result travels down the axon to its terminal branches.

Types of neurons. Each neuron computes an aggregated input value S from its weighted inputs and passes it through an activation function to produce the output y.

Activation function. For linear neurons the activation may be linear, sigmoidal, hyperbolic, exponential, or sinusoidal; for radial neurons it is Gaussian. The aggregation of inputs itself is linear; the output value is then produced by a (possibly nonlinear) activation function.
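As a minimal sketch (the function names are mine, not Statistica's), these activation functions can be written in Python:

```python
import math

def linear(s):
    return s                              # identity: output equals aggregated input S

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))     # logistic (sigmoidal), output in (0, 1)

def tanh_act(s):
    return math.tanh(s)                   # hyperbolic tangent, output in (-1, 1)

def gaussian(s):
    return math.exp(-s * s)               # Gaussian, used by radial (RBF) neurons

print(sigmoid(0.0))   # 0.5
```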

Neuron's learning

Prediction (worked example). Input: X1, X2, X3; output: Y; model: Y = f(X1, X2, X3), with activation f(x) = e^x / (1 + e^x).
For X1 = 1, X2 = -1, X3 = 2:
Hidden node 1: weighted sum 0.5*1 + (-0.1)*(-1) + (-0.2)*2 = 0.2, and f(0.2) = e^0.2 / (1 + e^0.2) = 0.55.
Hidden node 2: weighted sum 0.9, and f(0.9) = 0.71.
Output node: weighted sum 0.1*0.55 + (-0.2)*0.71 = -0.087, and f(-0.087) = 0.478, so the prediction is Y = 0.478.
If the true value is Y = 2, the prediction error is 2 - 0.478 = 1.522.
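This forward pass can be reproduced in a few lines of Python. The first hidden neuron's weights and the output weights come from the slide; the second hidden neuron's weights are an assumption chosen to match the slide's weighted sum of 0.9:

```python
import math

def f(x):                                   # logistic activation from the slide
    return math.exp(x) / (1.0 + math.exp(x))

def forward(inputs, hidden_weights, output_weights):
    # Weighted sum into each hidden neuron, then activation.
    hidden = [f(sum(w * x for w, x in zip(ws, inputs))) for ws in hidden_weights]
    # Weighted sum into the output neuron, then activation.
    return f(sum(w * h for w, h in zip(output_weights, hidden)))

inputs = [1.0, -1.0, 2.0]                   # X1, X2, X3
hidden_weights = [[0.5, -0.1, -0.2],        # reproduces the slide's sum of 0.2
                  [0.6, 0.1, 0.2]]          # assumed: weights that sum to 0.9
output_weights = [0.1, -0.2]

y = forward(inputs, hidden_weights, output_weights)
print(round(y, 3))                          # ~0.478, so error = 2 - 0.478 = 1.522
```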

Learning process: 1. Randomly choose one observation. 2. Calculate the value of Y. 3. Compare Y with the actual value. 4. Modify the weights based on the calculated error.
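These four steps can be sketched for a single sigmoid neuron trained with the delta rule (a simplification of what Statistica actually does internally):

```python
import math
import random

random.seed(0)

def f(s):
    return 1.0 / (1.0 + math.exp(-s))

def train(data, n_steps=2000, l=0.5):
    weights = [0.0] * len(data[0][0])
    for _ in range(n_steps):
        x, actual = random.choice(data)                   # 1. pick a random observation
        y = f(sum(w * xi for w, xi in zip(weights, x)))   # 2. calculate Y
        error = actual - y                                # 3. compare with the actual value
        for i in range(len(weights)):                     # 4. modify the weights
            weights[i] += l * error * x[i]
    return weights

# Toy data: the output simply follows the first input.
data = [([1.0, 0.0], 1.0), ([0.0, 1.0], 0.0)]
w = train(data)
```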

Backpropagation is one of the most popular learning techniques for neural networks.

How to calculate the prediction error? where: Error_i is the error of the i-th node, Output_i is the value predicted by the network, and Actual_i is the real value that should have been predicted.
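One common textbook formula for a sigmoid output node, shown here as an assumption rather than Statistica's exact definition, is Error_i = Output_i * (1 - Output_i) * (Actual_i - Output_i):

```python
# Assumed textbook error for a sigmoid output node (not necessarily
# Statistica's exact formula): the derivative term Output*(1-Output)
# scales the raw difference between actual and predicted values.
def output_error(output_i, actual_i):
    return output_i * (1.0 - output_i) * (actual_i - output_i)

# For the worked prediction example: predicted 0.478, actual 2.
print(output_error(0.478, 2.0))
```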

Weights modification. l is the learning factor, taken from the range [0, 1]. The smaller l is, the slower the learning process. Very often l starts high at the beginning and is then reduced as the weights change.
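A decay of the learning factor from a high starting value to a small final one can be sketched like this (the linear schedule is my assumption; Statistica may use a different one):

```python
# Learning factor l decays linearly from l_start to l_end over n_steps,
# so early updates are large and later updates are small.
def learning_factor(step, l_start=0.9, l_end=0.05, n_steps=100):
    frac = min(step / n_steps, 1.0)
    return l_start + frac * (l_end - l_start)

print(learning_factor(0))     # 0.9
print(learning_factor(100))   # 0.05
```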

Example


How many neurons? The number of neurons in the input layer depends on the number of input variables. The number of neurons in the output layer depends on the type of problem the network is to solve. The number of neurons in the hidden layer depends on the user's judgment and experience.

Neural network tasks: classification, where the NN decides the class of a given object (classes on a nominal scale); regression, where the NN predicts a numerical value of the output attribute.

Classification. 1. Dataset: leukemia.sta. 2. Choose the type of NN.

3. Choose the variables. 4. Automatic generation of NN.

You may change the proportions in which the dataset is divided into learning and testing samples.

Automatic generation of NN. Linear neurons (MLP); a minimum of 3 and a maximum of 10 neurons in the hidden layer; 20 networks are trained and the 5 best are displayed; error function: SSE. The window shows the creation of a 3-6-2 model, where 3 is the number of neurons in the input layer, 6 in the hidden layer, and 2 in the output layer.
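The SSE error function minimized during the search is the sum of squared differences between predictions and actual values; a minimal sketch (standard definition; Statistica may scale it differently):

```python
# Sum-of-squared-errors over a set of predictions.
def sse(predictions, actuals):
    return sum((p - a) ** 2 for p, a in zip(predictions, actuals))

# For the single worked example: (2 - 0.478)^2 = 1.522^2, about 2.316.
print(sse([0.478], [2.0]))
```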

The 3 best networks are saved.

Available tabs: Predictions, Graphs, Details, Liftcharts, Custom predictions, Summary.

Predictions

Details: Summary, Weights, Confusion matrix.

Details. Interesting options include: Summary, Weights, Confusion matrix.


Results

The Liftcharts tab. Further reading: http://www.statsoft.pl/czytelnia/artykuly/krzywe_roc_czyli_ocena_jakosci.pdf

Regression. 1. Dataset: tomatoes.sta. 2. Choose the type of network:

3. Choose the variables. 4. Automatic generation of NN.

The 2 best networks are saved.

Available tabs: Predictions, Graphs, Details, Liftcharts, Custom predictions, Summary.

Predictions

Graphs

Details. Interesting options include: Summary, Weights, Correlation coefficients, Confusion matrix.

Results

Further reading: http://zsi.tech.us.edu.pl/~nowak/si/si_w4.pdf