Gdansk University of Technology Faculty of Electrical and Control Engineering Department of Control Systems Engineering
Artificial Intelligence Methods
Neuron, neural layer, neural networks - surface of the neural network response
Laboratory exercises no T1
Auxiliary material for laboratory exercises
Authors: Anna Kobylarz, mgr inż.; Kazimierz Duzinkiewicz, dr hab. inż.; Michał Grochowski, dr inż.
Gdańsk, 2015
The material contains selected parts of the lecture and additional information needed to create artificial neural network structures in the MATLAB environment using the Neural Network Toolbox. In particular, it contains a description of the basic elements of a neuron, activation functions, a description of the main instructions available in the MATLAB toolbox, and an example of creating and training a perceptron neural network.

Artificial neural networks - a single neuron

A neural network is characterized by:
1. The functions according to which a neuron responds to its inputs, called excitation functions and activation functions;
2. The structure of the connections between neurons, called the network architecture;
3. The method of determining the weights of those connections, called the learning algorithm.

Fig. 1. Artificial neuron model scheme: the input signals p1, p2, ..., pR are multiplied by the weights w1,1, ..., w1,R and combined by the excitation (propagation) function g() together with the threshold b, giving the excitation signal n; the activation function f() turns n into the output signal a.
Where: R - number of inputs, S - number of neurons in a layer.

Fig. 2 shows how a neuron is presented in the MATLAB documentation along with the accepted indications of inputs, weights, threshold and output.
Fig. 2. Scheme of a neuron with a single input. Symbols and notation from MATLAB.
For this case, the dimensions of the matrices describing the neuron are as follows:

p = [p] (1x1); W = [w] (1x1); b = [b] (1x1); a = [a] (1x1)

In cases where multiple signals are given on the input, the neuron and its scheme are as follows:

Fig. 3. Scheme of neuron with multiple (R) inputs.
Fig. 4. Scheme of neuron with multiple (R) inputs. Symbols and notation from MATLAB.
Where: R - number of inputs.

The dimensions of the matrices describing this neuron are as follows:

p = [p1; p2; ...; pR] (Rx1); W = [w1,1 w1,2 ... w1,R] (1xR); b = [b] (1x1); a = [a] (1x1)

Activation (transfer) functions of the neuron
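The matrix form above can be checked directly in MATLAB. The following sketch computes the response of a single neuron with R = 3 inputs; all numeric values for the input, weights and bias are illustrative assumptions, not taken from the exercise:

```matlab
% Single neuron with R = 3 inputs (all numeric values are illustrative).
p = [2; -1; 0.5];      % R x 1 input vector
W = [0.4 -0.2 0.1];    % 1 x R weight (row) vector
b = 0.3;               % threshold (bias), 1 x 1

n = W*p + b;           % excitation signal, 1 x 1
a = logsig(n);         % output for a log-sigmoid activation function
```

Replacing logsig with hardlim or purelin gives the step and linear neurons of Figs 5 and 6.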
In Figs 5-7, examples of activation functions are shown: respectively the step, linear and sigmoidal logistic (log-sigmoid) function. All instructions available in the toolbox are given in Table 1.
Fig. 5. Hard limit activation function.
Where: p - input signal to neuron, n - excitation signal of neuron, a - output signal from neuron, w - weight value of the neuron's input, b - threshold value.
Fig. 6. Linear activation function.
Fig. 7. Log-sigmoid activation function.
Table 1. Activation (transfer) functions available in MATLAB's Neural Network Toolbox.
In Fig. 8 an example of a neuron with a symmetrical hard limit activation function (perceptron) and two inputs is shown.
Fig. 8. Neuron with symmetrical hard limit activation function and two inputs - perceptron.
Matrix dimensions for this neuron are as follows:

p = [p1; p2] (2x1); W = [w1,1 w1,2] (1x2); b = [b] (1x1); a = [a] (1x1)

Artificial neural networks - neural layer

Figures 9 and 10 show a single neural network layer with a description of its parameters.
Fig. 9. Neural network layer scheme.
Where: R - number of inputs, S - number of neurons in the layer.
Fig. 10. Neuron with hard limit activation function - perceptron. Symbols and notation from MATLAB.
Matrix dimensions for a neural layer are as follows:

p = [p1; p2; ...; pR] (Rx1);
W = [w1,1 w1,2 ... w1,R; w2,1 w2,2 ... w2,R; ...; wS,1 wS,2 ... wS,R] (SxR);
b = [b1; b2; ...; bS] (Sx1);
a = [a1; a2; ...; aS] (Sx1)

Artificial neural networks - multi-layer network

Figures 11 and 12 show a multi-layer (three-layer) feedforward neural network. As can be seen, the outputs of each layer are the inputs of the next layer.
Where: R - number of inputs, s1 - number of neurons in the first layer, s2 - number of neurons in the second layer, s3 - number of neurons in the third layer.
Fig. 11. Scheme of multi-layer feedforward neural network.
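For a whole layer the same formula a = f(Wp + b) applies, with the matrix dimensions listed above. A minimal MATLAB sketch for S = 3 neurons and R = 2 inputs (the weight and bias values are illustrative assumptions):

```matlab
% Layer of S = 3 neurons with R = 2 inputs (illustrative values).
p = [1; -2];               % R x 1 input vector
W = [ 0.5 -1.0;
      0.2  0.3;
     -0.7  0.9];           % S x R weight matrix, one row per neuron
b = [0.1; -0.4; 0.0];      % S x 1 bias vector

a = hardlim(W*p + b);      % S x 1 layer output, one element per neuron
```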
Fig. 12. Scheme of multi-layer feedforward neural network. Symbols and notation from MATLAB.
Basic commands of MATLAB Neural Network Toolbox ver. 8.2

Table 2 below shows the important commands from MATLAB's Neural Network Toolbox. More commands can be obtained by typing help nnet, while the details of a specific command - its syntax, algorithms, application examples etc. - can be obtained by typing help followed by the command name, e.g. help newp.

Table 2. MATLAB's Neural Network Toolbox ver. 8.2 commands.

Creating a network
network - Create a custom neural network.
newc - Create a competitive layer.
newcf - Create a cascade-forward backpropagation network.
newff - Create a feed-forward backpropagation network.
newfftd - Create a feed-forward input time-delay backpropagation network.
newlin - Create a linear layer.
newlind - Design a linear layer.
newp - Create a perceptron.

Net input (excitation) functions
netprod - Product net input function.
netsum - Sum net input function.

Functions initializing network parameters
initlay - Layer-by-layer network initialization function.

Functions describing the quality of the network's work
mae - Mean absolute error performance function.
mse - Mean squared normalized error performance function.
msereg - Mean squared error with regularization performance function.
sse - Sum squared error performance function.

Learning methods
learncon - Conscience bias learning function.
learngd - Gradient descent weight and bias learning function.
learngdm - Gradient descent with momentum weight and bias learning function.
learnis - Instar weight learning function.
learnlv1 - LVQ1 weight learning function.
learnlv2 - LVQ2 weight learning function.
learnos - Outstar weight learning function.
learnp - Perceptron weight and bias learning function.
learnpn - Normalized perceptron weight and bias learning function.
learnwh - Widrow-Hoff weight/bias learning function.

Processing of input and output data
prestd - Preprocess the data so that the mean is 0 and the standard deviation is 1.
poststd - Postprocess data which has been preprocessed by PRESTD.
trastd - Preprocess data using a precalculated mean and standard deviation.
premnmx - Preprocess data so that the minimum is -1 and the maximum is 1.
postmnmx - Postprocess data which has been preprocessed by PREMNMX.
tramnmx - Transform data using a precalculated min and max.
prepca - Principal component analysis.
trapca - Principal component transformation.
postreg - Postprocess the trained network response with a linear regression.
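As a sketch of how the scaling pair premnmx/postmnmx from the table works together (the input data are an illustrative assumption; these are the older toolbox functions listed in Table 2, later superseded by mapminmax):

```matlab
% Scale data to [-1, 1] for training, then restore the original range.
p = [1 4 9 16];                   % illustrative raw input data
[pn, minp, maxp] = premnmx(p);    % pn lies in [-1, 1]
p2 = postmnmx(pn, minp, maxp);    % p2 equals p again
```

New data for an already-trained network would be scaled with tramnmx(p, minp, maxp), using the same precalculated min and max.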
Learning methods (training functions)
trainb - Batch training with weight & bias learning rules.
trainbfg - BFGS quasi-Newton backpropagation.
trainbr - Bayesian regularization backpropagation.
trainc - Cyclical order weight/bias training.
traincgb - Conjugate gradient backpropagation with Powell-Beale restarts.
traincgf - Conjugate gradient backpropagation with Fletcher-Reeves updates.
traincgp - Conjugate gradient backpropagation with Polak-Ribiere updates.
traingd - Gradient descent backpropagation.
traingdm - Gradient descent with momentum.
traingda - Gradient descent with adaptive learning rate backpropagation.
traingdx - Gradient descent with momentum & adaptive learning rate backpropagation.
trainlm - Levenberg-Marquardt backpropagation.
trainoss - One step secant backpropagation.
trainr - Random order weight/bias training.
trainrp - RPROP backpropagation.
trainscg - Scaled conjugate gradient backpropagation.

Activation functions
compet - Competitive transfer function.
hardlim - Hard-limit transfer function.
hardlims - Symmetric hard-limit transfer function.
logsig - Log-sigmoid transfer function.
poslin - Positive linear transfer function.
purelin - Linear transfer function.
radbas - Radial basis transfer function.
satlin - Saturating linear transfer function.
satlins - Symmetric saturating linear transfer function.
tansig - Hyperbolic tangent sigmoid transfer function.

Functions allowing easier analysis
errsurf - Error surface of single-input neuron.
maxlinlr - Maximum learning rate for linear layer.

Functions initializing layer's parameters
initnw - Nguyen-Widrow layer initialization function.
initb - By-weight-and-bias layer initialization function.

Operations on vectors
cell2mat - Convert cell array to numeric array.
concur - Create concurrent bias vectors.
con2seq - Convert concurrent vectors to sequential vectors.
combvec - Create all combinations of vectors.
mat2cell - Convert array to cell array with potentially different sized cells.
minmax - Ranges of matrix rows.
nncopy - Copy matrix or cell array.
normc - Normalize columns of matrix.
normr - Normalize rows of matrix.
seq2con - Convert sequential vectors to concurrent vectors.
sumsqr - Sum of squared elements of matrix or matrices.

Operations on networks
sim - Simulate dynamic system (neural network).
init - Initialize neural network.
adapt - Adapt neural network to data as it is simulated.
train - Train neural network.
Plotting
hintonw - Hinton graph of weight matrix.
hintonwb - Hinton graph of weight matrix and bias vector.
plotes - Plot error surface of single-input neuron.
plotpc - Plot classification line on perceptron vector plot.
plotpv - Plot perceptron input/target vectors.
plotep - Plot weight-bias position on error surface.
plotperf - Plot network performance.
plotv - Plot vectors as lines from origin.
plotvec - Plot vectors with different colors.

Others
nntool - Neural Network Toolbox graphical user interface.
gensim - Generate a Simulink block to simulate a neural network.
An example of creating and learning a perceptron neural network

Below are the basic commands used to create an example perceptron neural network. The process of creating other networks is similar; only the parameters and some instructions differ.

It is possible to create a perceptron network by using the newp command. Syntax:

net = newp(pr,s,tf,lf)

The parameters of this command are:
PR - RxQ matrix of Q1 representative input vectors.
S - SxQ matrix of Q2 representative target vectors.
TF - Transfer (activation) function, default = 'hardlim'.
LF - Learning function, default = 'learnp'.

It is not necessary to give the TF and LF parameters. The result of this command is a perceptron network. Under the name net a structure is created, within which all the information about the network is stored.

Network initialization

The newp command also initializes the weights and thresholds with initial (zero) values. The weights connecting the inputs to the network are stored in the structure net.IW. The thresholds are stored in the network structure net.b. It is also possible to set the values of these elements independently, e.g.:

net.IW{1,1}=[2 4]
net.b{1}=[5]

To restore the default weights and thresholds (initialization of the network parameters), the init command should be used:

net = init(net);

Network simulation

To study the network's response to an input vector, the sim command should be used. The syntax of this command is available after typing help sim in the MATLAB command window. Mostly it is sufficient to specify which network is going to be used and to indicate the input vector, in this case P:
Y = sim(net,P);

This simulates the response of network net to input P. Defining the input vector P as:

P = [ ; ];

and the target vector T:

T = [ ];

the result is the network's response:

Y =

which is not compliant with expectations (the target T). To achieve the wanted output values (according to the target), a proper selection of weight and bias values is necessary, which means that the network has to learn how to respond properly.

Example

Design a perceptron network consisting of a single neuron with two inputs. The first input should be within the range [0; 1] and the second input within the range [-2; 2]. Save the proposed structure under the name net. In response to a predefined vector P, the network should respond in accordance with the vector T.

net = newp([0 1; -2 2],1);

Response for vector P:

Y = sim(net,P)
Y =

Network learning

As we can see, the network's response is different from the vector T. In order to achieve the correct response, a change of the weight and bias values is needed, either manually or in the learning process. The command train is used to start the learning process. This command applies the network's default learning method (modification of weights and thresholds) in order to fit the response to the vector T. More options for the train command can be obtained by typing help train.
net = train(net,P,T);

The response of the network after the learning process is shown below:

Y = sim(net,P)
Y =

As can be seen, the network responded properly, which completes the example. It is possible to check with what values of weights and biases the task ends:

net.IW{1,1} = [1 1]
net.b{1} = [-1]

Changing the default parameters of the network

After creation, any network has various default parameters, such as: the learning method, the initialization method, the method of testing the quality of the network, the network learning process and many others. To check them, type the name of the created network in the MATLAB command window. These parameters can also be changed with the appropriate commands. Below are some basic ones (for the perceptron network created by the newp command).

Functions:
adaptFcn: 'trains'
initFcn: 'initlay'
performFcn: 'mae'
trainFcn: 'trainc'

Parameters:
adaptParam: .passes
initParam: (none)
performParam: (none)
trainParam: .epochs, .goal, .show, .time

Changing default values of network parameters (example values):

net.trainParam.show = 50;
net.trainParam.lr = 0.05;
net.trainParam.epochs = 300;
net.trainParam.goal = 1e-5;
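Putting the commands of this example together, a complete training session might look as follows. The input and target data (the logical AND function) are an illustrative assumption, since the vectors P and T used above are not reproduced in this material:

```matlab
% Complete perceptron workflow on illustrative data (logical AND).
P = [0 0 1 1; 0 1 0 1];        % inputs, one sample per column
T = [0 0 0 1];                 % targets
net = newp([0 1; 0 1], 1);     % perceptron: two inputs in [0,1], one neuron
net.trainParam.epochs = 20;    % limit the number of training epochs
net = train(net, P, T);        % adjust weights and bias to fit T
Y = sim(net, P);               % response should now match T
```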
More informationFeed Forward Neural Network for Solid Waste Image Classification
Research Journal of Applied Sciences, Engineering and Technology 5(4): 1466-1470, 2013 ISSN: 2040-7459; e-issn: 2040-7467 Maxwell Scientific Organization, 2013 Submitted: June 29, 2012 Accepted: August
More informationRecapitulation on Transformations in Neural Network Back Propagation Algorithm
International Journal of Information and Computation Technology. ISSN 0974-2239 Volume 3, Number 4 (2013), pp. 323-328 International Research Publications House http://www. irphouse.com /ijict.htm Recapitulation
More informationReview on Methods of Selecting Number of Hidden Nodes in Artificial Neural Network
Available Online at www.ijcsmc.com International Journal of Computer Science and Mobile Computing A Monthly Journal of Computer Science and Information Technology IJCSMC, Vol. 3, Issue. 11, November 2014,
More informationLecture 20: Neural Networks for NLP. Zubin Pahuja
Lecture 20: Neural Networks for NLP Zubin Pahuja zpahuja2@illinois.edu courses.engr.illinois.edu/cs447 CS447: Natural Language Processing 1 Today s Lecture Feed-forward neural networks as classifiers simple
More informationAn Efficient Implementation of Multi Layer Perceptron Neural Network for Signal Processing
An Efficient Implementation of Multi Layer Perceptron Neural Network for Signal Processing A.THILAGAVATHY 1, M.E Vlsi Desgin(Pg Scholar), Srinivasan Engineering College, Perambalur-621 212, Tamilnadu,
More informationOptimizing Number of Hidden Nodes for Artificial Neural Network using Competitive Learning Approach
Available Online at www.ijcsmc.com International Journal of Computer Science and Mobile Computing A Monthly Journal of Computer Science and Information Technology IJCSMC, Vol. 4, Issue. 5, May 2015, pg.358
More informationCOMP 551 Applied Machine Learning Lecture 14: Neural Networks
COMP 551 Applied Machine Learning Lecture 14: Neural Networks Instructor: (jpineau@cs.mcgill.ca) Class web page: www.cs.mcgill.ca/~jpineau/comp551 Unless otherwise noted, all material posted for this course
More informationNatural Language Processing CS 6320 Lecture 6 Neural Language Models. Instructor: Sanda Harabagiu
Natural Language Processing CS 6320 Lecture 6 Neural Language Models Instructor: Sanda Harabagiu In this lecture We shall cover: Deep Neural Models for Natural Language Processing Introduce Feed Forward
More informationObject Detection Lecture Introduction to deep learning (CNN) Idar Dyrdal
Object Detection Lecture 10.3 - Introduction to deep learning (CNN) Idar Dyrdal Deep Learning Labels Computational models composed of multiple processing layers (non-linear transformations) Used to learn
More informationUsing neural nets to recognize hand-written digits. Srikumar Ramalingam School of Computing University of Utah
Using neural nets to recognize hand-written digits Srikumar Ramalingam School of Computing University of Utah Reference Most of the slides are taken from the first chapter of the online book by Michael
More informationMULTILAYER PERCEPTRON WITH ADAPTIVE ACTIVATION FUNCTIONS CHINMAY RANE. Presented to the Faculty of Graduate School of
MULTILAYER PERCEPTRON WITH ADAPTIVE ACTIVATION FUNCTIONS By CHINMAY RANE Presented to the Faculty of Graduate School of The University of Texas at Arlington in Partial Fulfillment of the Requirements for
More informationANALYSIS AND REVIEW OF THE CONTRIBUTION OF NEURAL NETWORKS TO SAVING ELECTRICITY IN RESIDENTIAL LIGHTING BY A DESIGN IN MATLAB
ANALYSIS AND REVIEW OF THE CONTRIBUTION OF NEURAL NETWORKS TO SAVING ELECTRICITY IN RESIDENTIAL LIGHTING BY A DESIGN IN MATLAB ANÁLISIS Y ESTUDIO DE LA CONTRIBUCIÓN DE LAS REDES NEURONALES AL AHORRO DE
More informationIndex. Umberto Michelucci 2018 U. Michelucci, Applied Deep Learning,
A Acquisition function, 298, 301 Adam optimizer, 175 178 Anaconda navigator conda command, 3 Create button, 5 download and install, 1 installing packages, 8 Jupyter Notebook, 11 13 left navigation pane,
More informationChiang Mai J. Sci. 2011; 38 (Special Issue) : Contributed Paper
Chiang Mai J. Sci. 2011; 38 (Special Issue) 123 Chiang Mai J. Sci. 2011; 38 (Special Issue) : 123-135 www.science.cmu.ac.th/journal-science/josci.html Contributed Paper A Comparative Analysis of Conjugate
More informationPREDICTING THE THERMAL PERFORMANCE FOR THE MULTI- OBJECTIVE VEHICLE UNDERHOOD PACKING OPTIMIZATION PROBLEM
Clemson University TigerPrints All Theses Theses 8-2011 PREDICTING THE THERMAL PERFORMANCE FOR THE MULTI- OBJECTIVE VEHICLE UNDERHOOD PACKING OPTIMIZATION PROBLEM Ravi teja Katragadda Clemson University,
More informationTraffic Signs Recognition using HP and HOG Descriptors Combined to MLP and SVM Classifiers
Traffic Signs Recognition using HP and HOG Descriptors Combined to MLP and SVM Classifiers A. Salhi, B. Minaoui, M. Fakir, H. Chakib, H. Grimech Faculty of science and Technology Sultan Moulay Slimane
More informationMulti Layer Perceptron with Back Propagation. User Manual
Multi Layer Perceptron with Back Propagation User Manual DAME-MAN-NA-0011 Issue: 1.3 Date: September 03, 2013 Author: S. Cavuoti, M. Brescia Doc. : MLPBP_UserManual_DAME-MAN-NA-0011-Rel1.3 1 INDEX 1 Introduction...
More informationArtificial Neural Networks Lecture Notes Part 5. Stephen Lucci, PhD. Part 5
Artificial Neural Networks Lecture Notes Part 5 About this file: If you have trouble reading the contents of this file, or in case of transcription errors, email gi0062@bcmail.brooklyn.cuny.edu Acknowledgments:
More informationDOUBLE-CURVED SURFACE FORMING PROCESS MODELING
7th International DAAAM Baltic Conference INDUSTRIAL ENGINEERING 22-24 April 2010, Tallinn, Estonia DOUBLE-CURVED SURFACE FORMING PROCESS MODELING Velsker, T.; Majak, J.; Eerme, M.; Pohlak, M. Abstract:
More informationAutomatic Deployment and Formation Control of Decentralized Multi-Agent Networks
Automatic Deployment and Formation Control of Decentralized Multi-Agent Netorks Brian S. Smith, Magnus Egerstedt, and Ayanna Hoard Abstract Novel tools are needed to deploy multi-agent netorks in applications
More informationCharacter Recognition Using Convolutional Neural Networks
Character Recognition Using Convolutional Neural Networks David Bouchain Seminar Statistical Learning Theory University of Ulm, Germany Institute for Neural Information Processing Winter 2006/2007 Abstract
More informationLecture 2 Notes. Outline. Neural Networks. The Big Idea. Architecture. Instructors: Parth Shah, Riju Pahwa
Instructors: Parth Shah, Riju Pahwa Lecture 2 Notes Outline 1. Neural Networks The Big Idea Architecture SGD and Backpropagation 2. Convolutional Neural Networks Intuition Architecture 3. Recurrent Neural
More informationCSC 578 Neural Networks and Deep Learning
CSC 578 Neural Networks and Deep Learning Fall 2018/19 7. Recurrent Neural Networks (Some figures adapted from NNDL book) 1 Recurrent Neural Networks 1. Recurrent Neural Networks (RNNs) 2. RNN Training
More informationEmploying ANFIS for Object Detection in Robo-Pong
Employing ANFIS for Object Detection in Robo-Pong R. Sabzevari 1, S. Masoumzadeh, and M. Rezaei Ghahroudi Department of Computer Engineering, Islamic Azad University of Qazvin, Qazvin, Iran 1 Member of
More informationAbalone Age Prediction using Artificial Neural Network
IOSR Journal o Computer Engineering (IOSR-JCE) e-issn: 2278-066,p-ISSN: 2278-8727, Volume 8, Issue 5, Ver. II (Sept - Oct. 206), PP 34-38 www.iosrjournals.org Abalone Age Prediction using Artiicial Neural
More informationRES 3000 Version 3.0 CA/PMS Installation and Setup Instructions
RES 3000 Version 3.0 CA/PMS Installation and Setup Instructions $ERXW7KLV'RFXPHQW This document provides installation and setup instructions for the CA/ PMS credit card driver. The type of CA/EDC Driver
More informationExtreme Learning Machines. Tony Oakden ANU AI Masters Project (early Presentation) 4/8/2014
Extreme Learning Machines Tony Oakden ANU AI Masters Project (early Presentation) 4/8/2014 This presentation covers: Revision of Neural Network theory Introduction to Extreme Learning Machines ELM Early
More informationIEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 16, NO. 2, MARCH
IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 16, NO. 2, MARCH 2005 325 Linear-Least-Squares Initialization of Multilayer Perceptrons Through Backpropagation of the Desired Response Deniz Erdogmus, Member,
More informationInverse Analysis of Soil Parameters Based on Deformation of a Bank Protection Structure
Inverse Analysis of Soil Parameters Based on Deformation of a Bank Protection Structure Yixuan Xing 1, Rui Hu 2 *, Quan Liu 1 1 Geoscience Centre, University of Goettingen, Goettingen, Germany 2 School
More informationInternational Journal of Advanced Research in Computer Science and Software Engineering
Volume 3, Issue 4, April 203 ISSN: 77 2X International Journal of Advanced Research in Computer Science and Software Engineering Research Paper Available online at: www.ijarcsse.com Stock Market Prediction
More informationVideo Seamless Splicing Method Based on SURF Algorithm and Harris Corner Points Detection
Vol13 (Softech 016), pp138-14 http://dxdoiorg/101457/astl016137 Video Seamless Splicing Method Based on SURF Algorithm and Harris Corner Points Detection Dong Jing 1, Chen Dong, Jiang Shuen 3 1 College
More information