6. NEURAL NETWORK BASED PATH PLANNING ALGORITHM

6.1 INTRODUCTION
In the previous chapters, path planning algorithms such as the trigonometry based path planning algorithm and the direction based path planning algorithm were developed; these belong to the category of conventional or traditional path planning methods. Conventional methods work sequentially, performing only one task at a time. These methods function logically with a set of rules and calculations; a conventional method learns by rules. Neural network based methods, on the other hand, have the ability to process things in parallel and do many things at once; neural networks learn by example. While conventional methods can be programmed using any high level language, neural networks can be self programmed.

A sensor based navigation scheme that makes use of a global representation of the environment by means of a self organizing (Kohonen) network is presented in [68]. A neural map, which offers a promising alternative to the distance transform and harmonic function methods for both global and local navigation, is presented in [69]. A neural network model learned from human driving data, introduced to model obstacle avoidance through dense areas of obstacles, is presented in [70]; it is tested in different scenarios and compared using cross validation to determine the optimal network structure. A method of constructing a collision free path for moving a robot among obstacles, based on two neural networks, is presented in [71]. The path planning of a mobile robot using a modified Hopfield neural network is presented in [77]. A collision avoidance scheme for a multiple robot system is proposed in [80]. Based on the above works, a neural network based path planning algorithm is developed, based on a parallel distributed neural network model, in order to extinguish fire in both types of environment, i.e., environments with and without obstacles.
6.2 NEURAL NETWORKS

6.2.1 A SIMPLE NEURON

An Artificial Neural Network (ANN) is a mathematical model inspired by the structural as well as the functional aspects of biological neural networks [146]. ANNs have been employed in various areas such as computing, medicine, engineering, economics, and many others. ANNs are composed of a number of simple computational elements, called neurons, organized into a structured graph topology made up of several consecutive layers and interconnected through a series of links called synaptic weights. Synaptic weights are associated with variable numerical values, which can be adapted so as to allow the ANN to change its behavior based on the problem being tackled. A simple neuron with a single R-element input vector is shown in Figure 6.2.1.

Figure 6.2.1 Simple neuron (inputs p1, ..., pR, weights w1,1, ..., w1,R, bias b, summing junction n, transfer function F, output o)

Here the individual element inputs p1, p2, ..., pR are multiplied by the weights w1,1, w1,2, ..., w1,R, and the weighted values are given as input to the summing junction. Their sum is simply Wp, the dot product of the (single-row) matrix W and the vector p.

6.2.2 ARCHITECTURE OF NEURAL NETWORKS

The architecture of neural networks can be classified into:
a. Feed-forward neural network
b. Feedback neural network
These network architectures can be either simulated using software or implemented using hardware.

a. Feed-forward neural network
Feed-forward ANNs allow signals to travel in one direction only, i.e., from the input to the output. There is no feedback in a feed-forward network; i.e., the output of any layer does not affect that same layer. A feed-forward neural network is shown in Figure 6.2.2. Feed-forward ANNs tend to be straightforward networks that associate inputs with outputs, and they are widely used in pattern recognition. This type of organization is also referred to as bottom-up or top-down. Feed-forward networks are static networks in the sense that, given an input value, they produce only one set of output values, not a sequence of values. Feed-forward networks are memoryless networks in the sense that the output of a feed-forward network does not depend on the previous state of the network.
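The computation of the simple neuron above, and the static, memoryless nature of a feed-forward pass, can be sketched as follows; the weight, bias, and input values here are illustrative assumptions, not values from the model:

```python
def neuron_output(weights, p, bias, transfer=lambda n: n):
    """One neuron: weighted sum Wp plus bias b, passed through a transfer function F."""
    n = sum(w * x for w, x in zip(weights, p)) + bias  # n = Wp + b
    return transfer(n)

# A feed-forward pass is static and memoryless: the same input always
# yields the same output, independent of any previous inputs.
W = [0.5, -0.25, 1.0]   # illustrative weights w1,1 ... w1,R (R = 3)
b = 0.1                  # illustrative bias
o1 = neuron_output(W, [1.0, 2.0, 3.0], b)
o2 = neuron_output(W, [1.0, 2.0, 3.0], b)  # same input, same output
```

With a linear (identity) transfer function, the output here is simply Wp + b; substituting a threshold or sigmoid function changes only the final step.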
Figure 6.2.2 Feed-forward neural network

b. Feedback neural network
Feedback networks can have signals travelling in both directions by introducing loops in the network. A feedback neural network is shown in Figure 6.2.3. Feedback networks are very powerful and can become extremely complicated. Feedback networks are dynamic in the sense that their state changes continuously until they reach an equilibrium point. They remain at the equilibrium point until the input changes and a new equilibrium needs to be found. Feedback networks are also referred to as interactive or recurrent networks, although the term recurrent is often used to denote feedback connections in single-layer organizations.
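The settling behavior of a feedback network can be sketched with a single self-connected unit; the tanh activation, self-weight, and input values are illustrative assumptions:

```python
import math

def settle(x, w_self, external_input, tol=1e-9, max_steps=10000):
    """Iterate a single self-connected unit until its state stops changing,
    i.e., until it reaches an equilibrium point."""
    for _ in range(max_steps):
        x_next = math.tanh(w_self * x + external_input)
        if abs(x_next - x) < tol:
            return x_next  # equilibrium reached
        x = x_next
    return x

# The state settles to a fixed point and stays there until the input changes,
# after which a new equilibrium must be found.
eq1 = settle(0.0, w_self=0.5, external_input=1.0)
eq2 = settle(eq1, w_self=0.5, external_input=-1.0)  # new input, new equilibrium
```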
Figure 6.2.3 Feedback neural network

6.2.3 LEARNING IN NEURAL NETWORKS

All learning methods used for adaptive neural networks can be classified into two major categories:
a. Supervised learning
b. Unsupervised learning

a. Supervised learning
In supervised learning, both the inputs and the outputs are provided. The network processes the inputs and compares the resulting outputs against the desired outputs. Errors are then propagated back through the system, causing it to adjust the weights which control the network. This process occurs again and again, and the weights are continually changed until convergence. The set of data which enables the training is called the "training set". During the training of a network, the same set of data is processed many times as the connection weights are refined. An important issue concerning supervised learning is the problem of error convergence, i.e., the minimization of the error between the desired and the computed unit values. The aim is to determine a set of weights which minimizes the error. One well-known method, common to many learning paradigms, is least mean square (LMS) convergence.

b. Unsupervised learning
In this type, the network is provided with inputs but not with the desired outputs. The system itself must then decide what features it will use to cluster the input data. This is referred to as self-organization or adaptation. These networks use no external influences to adjust their weights; instead, they monitor their performance internally. These networks look for regularities or trends in the input signals, and make adaptations according to the function of the network. Even without being told whether it is right or wrong, the network must still have some information about how to organize itself; this information is built into the network topology and the learning rules. An unsupervised learning algorithm might emphasize cooperation among clusters of processing elements; in such a scheme, the clusters would work together. Examples of unsupervised learning are Hebbian learning and competitive learning.

A further aspect of learning concerns whether or not there is a separate phase during which the network is trained, distinct from a subsequent operation phase. We say that a neural network learns off-line if the learning phase and the operation phase are distinct, and that it learns on-line if it learns and operates at the same time. Usually, supervised learning is performed off-line, whereas unsupervised learning is performed on-line.

6.2.4 TRANSFER FUNCTION

The result of the summation function is transformed into an output through an algorithmic process known as the transfer function. In the transfer function, the summation can be compared with some threshold to determine the neural output. If the sum is greater than the threshold value, the processing element generates a signal; if it is less than the threshold, no signal is generated. Both types of responses are significant. The transfer function is classified into:
a. linear transfer function
b. threshold transfer function
c. sigmoid transfer function

a. Linear transfer function
For linear units, the output activity is proportional to the total weighted input. The linear transfer function is shown in Figure 6.2.4.

Figure 6.2.4 Linear transfer function

b. Threshold transfer function
For threshold units, the output is set at one of two levels, depending on whether the total input is greater than or less than some threshold value T. The threshold transfer function is shown in Figure 6.2.5.

Figure 6.2.5 Threshold transfer function

c. Sigmoid transfer function
For sigmoid units, the output varies continuously but not linearly as the input changes. Sigmoid units bear a greater resemblance to real neurons than do linear or threshold units, but all three must be considered rough approximations. The sigmoid transfer function is shown in Figure 6.2.6.

Figure 6.2.6 Sigmoid transfer function

6.3 ASSUMPTIONS USED IN THE MODEL

1. The forest domain is decomposed into M x N grids of square cells.
2. The decomposition of the forest domain into 20 x 20 grids of square cells is shown in Figure 6.3.1.
3. Each cell in the grid contains an anchor sensor node which knows its location, based on integers.
4. The actor (robot) is available at cell 1, which is always the start cell, and the cell in which fire occurs is always the goal cell.
5. Once a fire occurs inside a particular cell, it will first be detected by the sensor placed inside that cell, and the sensor sends a message containing the coordinates of the cell to the actor. Thus, the actor knows both the start and goal cells. It then uses the algorithm, implemented using the neural network, to find a path.

Figure 6.3.1 Decomposition of the forest using 20 x 20 grids with coordinates based on integers

6. Obstacles are static, and the size of an obstacle is similar to the size of a cell.
7. Two adjacent cells may have obstacles either lengthwise or breadthwise, but not both.
8. Since the actor is available at cell 1, and based on assumptions 6 and 7, only 3 movements are sufficient to navigate the entire domain: (i) UP, denoted by 0, (ii) DIAGONAL, denoted by 1, and (iii) LEFT, denoted by 2, as shown in Figure 6.3.2, where CPA denotes the current position/cell of the actor.
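Under these assumptions, the three candidate moves can be computed directly from a cell number; a minimal sketch, assuming row-major numbering from 1 (an assumption inferred from the move arithmetic used later, where UP is s+n, DIAGONAL is s+n+1 and LEFT is s+1):

```python
def candidate_moves(s, n):
    """Return the cells reached by the three allowed movements from cell s
    in an n x n grid, using the thesis's direction labels."""
    return {
        "UP": s + n,            # movement 0
        "DIAGONAL": s + n + 1,  # movement 1
        "LEFT": s + 1,          # movement 2
    }

moves = candidate_moves(1, 20)  # from the start cell of a 20 x 20 grid
```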
Figure 6.3.2 Directional movements of the actor

6.4 PATH PLANNING ALGORITHM

The actor placed in cell 1 uses the algorithm shown below to estimate a sequence of points which does not contain any obstacle. It then moves through these points to reach the goal cell where the fire has occurred and starts to extinguish it. The algorithm is shown below:

6.4.1 Algorithm for estimating the path

Let s be the start position, g the goal position, and n the number of cells in a row or column.

    location = 0
    store the start position s in the first location of the memory path
    while (s not equal to g)
        location = location + 1
        if ((abs(s - g)) mod (n + 1) = 0) then
            if (cell s + n + 1 contains an obstacle) then
                check the up move cell (s + n) and the left move cell (s + 1)
                if (both cells are free, or only the left move cell contains an obstacle) then
                    s = s + n
                if (only the up move cell contains an obstacle) then
                    s = s + 1
            else
                s = s + n + 1
        else if ((abs(s - g)) mod n = 0) then
            if (cell s + n contains an obstacle) then
                check the diagonal move cell (s + n + 1) and the left move cell (s + 1)
                if (both cells are free, or only the left move cell contains an obstacle) then
                    s = s + n + 1
                if (only the diagonal move cell contains an obstacle) then
                    s = s + 1
            else
                s = s + n
        else
            if (cell s + 1 contains an obstacle) then
                check the up move cell (s + n) and the diagonal move cell (s + n + 1)
                if (both cells are free, or only the up move cell contains an obstacle) then
                    s = s + n + 1
                if (only the diagonal move cell contains an obstacle) then
                    s = s + n
            else
                s = s + 1
        store the point s in memory location 'location'

6.4.2 Parallel distributed neural network model

The parallel distributed neural network model is shown in Figure 6.4.2. It uses 3 neurons, each of which takes the input cell s, multiplies it by a weight, adds a bias, and sums the result. It uses reinforcement learning; i.e., the network is designed in such a way that, each time, the best next move is selected out of the three possible moves. The selected move is rewarded, and the moves not selected are punished. The weights used in the model are binary and take the value of either 0 or 1. The weight is calculated for each neuron separately using the formulas shown below:

    w1 = 1 if ((abs(s - g)) mod (n + 1) = 0), 0 otherwise
    w2 = 1 if ((abs(s - g)) mod n = 0), 0 otherwise
    w3 = 1 if (((abs(s - g)) mod (n + 1) != 0) and ((abs(s - g)) mod n != 0)), 0 otherwise

abs(x) is a function which takes an integer argument x, positive or negative, and returns the positive value of the argument.

Figure 6.4.2 Parallel distributed neural network model

where
s - the starting cell in the first iteration; in subsequent iterations, the next cell in the sequence to which the actor has to move
g - the goal cell where the fire has occurred
n - the number of cells in a row or column of the decomposed forest domain
w1, w2, w3 - the weights connected to the neurons
AF1, AF2, AF3 - the activation functions for neurons 1, 2 and 3 respectively
o - the net output of the activation functions summed together

The output of each neuron is calculated as follows:

    output of neuron 1 = s + n + 1 if w1 = 1, n + 1 if w1 = 0
    output of neuron 2 = s + n if w2 = 1, n if w2 = 0
    output of neuron 3 = s + 1 if w3 = 1, 1 if w3 = 0

The weights vary from iteration to iteration, due to the change in the value of s in each iteration. The cell g is always constant, and it is not shown explicitly in the model. The model also uses 3 activation functions, which are calculated as:

    AF1 = s + n + 1 if the output of neuron 1 is s + n + 1, 0 otherwise
    AF2 = s + n if the output of neuron 2 is s + n, 0 otherwise
    AF3 = s + 1 if the output of neuron 3 is s + 1, 0 otherwise

Initially, the input value s is fed to the neural network. It is assumed that g is available as an environment variable, since it is constant. The weight is calculated for each neuron separately, and the output of each neuron is fed as input to its activation function. The net value of the three activation functions decides which move is selected. The selected move is then tested for the presence of an obstacle. If there is no obstacle in the cell selected for the next move, the cell number is stored and assigned to s. If an obstacle is present in the cell selected for the next move, the two cells obtained using the remaining two movements are checked for obstacles. If either cell is free (one cell will definitely be free because of the assumptions about the shape of the obstacles), its cell number is stored in memory and assigned to s. The process is repeated with the new s until the cell number of s is the same as that of cell g. Once the computation of the path is complete, the actor uses the sequence of cell numbers stored in memory to reach the cell where the fire has occurred and extinguishes it by suitable means.

6.5 SIMULATION RESULTS

In this work, the forest domain is considered as a grid decomposed into m x n cells based on integer values. The developed model is assumed to work with a single instance of fire occurrence, since only one actor is assumed to be available for the entire forest domain. The path planning algorithm developed is based on the parallel distributed neural network model for the actor, in order to extinguish fire in both types of environment, i.e., environments with and without obstacles. Java is used for simulation purposes. The algorithm was implemented on a 20 x 20 grid and was executed 100 times. To test the effectiveness of the proposed algorithm, the fire is created in various cells by varying the start and end coordinates, covering all the quadrant regions, horizontal lines, and vertical lines. The number of obstacles is also varied. This is achieved by properly designed test cases. The test cases are designed in such a way that the 123 statements, 6 independent paths, and 1 loop in the program are executed at least once. The actor is represented by a green square located at the top left corner, the cells containing obstacles are shown in black, the cell where the fire occurs is shown in red, and a red line shows the path planned for the actor to travel to reach the target area and extinguish the fire. The simulation results are shown in Figures 6.5.1 and 6.5.2.

Figure 6.5.1 Environment without obstacles
Figure 6.5.2 Environment with obstacles
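The path-estimation procedure of Section 6.4.1 (equivalently, the move selection performed by the parallel distributed model) can be sketched in Python; the row-major cell numbering from 1 and the obstacle set used in the example are illustrative assumptions:

```python
def plan_path(s, g, n, obstacles=frozenset()):
    """Estimate an obstacle-free sequence of cells from start s to goal g
    on an n x n grid, using the UP (s+n), DIAGONAL (s+n+1) and LEFT (s+1)
    movements of Section 6.4.1."""
    path = [s]
    while s != g:
        if abs(s - g) % (n + 1) == 0:        # w1 = 1: goal lies on the diagonal
            preferred, alt1, alt2 = s + n + 1, s + n, s + 1
        elif abs(s - g) % n == 0:            # w2 = 1: goal lies straight up
            preferred, alt1, alt2 = s + n, s + n + 1, s + 1
        else:                                # w3 = 1: move left
            preferred, alt1, alt2 = s + 1, s + n + 1, s + n
        if preferred not in obstacles:
            s = preferred
        elif alt1 not in obstacles:          # by assumption 7, one fallback is free
            s = alt1
        else:
            s = alt2
        path.append(s)
    return path

# 20 x 20 grid, no obstacles: three diagonal moves from cell 1 to cell 64.
print(plan_path(1, 64, 20))  # → [1, 22, 43, 64]
```

The fallback ordering in each branch mirrors the algorithm's obstacle checks: when the preferred cell is blocked, the first alternative listed for that branch is tried, then the remaining movement.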
A Neural Network based Path Planning Algorithm for Extinguishing Forest Fires
M.P. Sivaram Kumar (Research Scholar, Department of CSE, B S Abdur Rahman University, Vandalur) and S. Rajasekaran, www.ijcsi.org
Multi-Operand Addition 1 Multi-Operand Addition Ivor Page 1 9.1 Motivation The motivation for multi-operand adders comes from the need for innerproduct calculations and multiplication (summing the partial
More informationNeural Network Weight Selection Using Genetic Algorithms
Neural Network Weight Selection Using Genetic Algorithms David Montana presented by: Carl Fink, Hongyi Chen, Jack Cheng, Xinglong Li, Bruce Lin, Chongjie Zhang April 12, 2005 1 Neural Networks Neural networks
More informationturning data into dollars
turning data into dollars Tom s Ten Data Tips November 2008 Neural Networks Neural Networks (NNs) are sometimes considered the epitome of data mining algorithms. Loosely modeled after the human brain (hence
More informationTHE NEURAL NETWORKS: APPLICATION AND OPTIMIZATION APPLICATION OF LEVENBERG-MARQUARDT ALGORITHM FOR TIFINAGH CHARACTER RECOGNITION
International Journal of Science, Environment and Technology, Vol. 2, No 5, 2013, 779 786 ISSN 2278-3687 (O) THE NEURAL NETWORKS: APPLICATION AND OPTIMIZATION APPLICATION OF LEVENBERG-MARQUARDT ALGORITHM
More informationFeedback Alignment Algorithms. Lisa Zhang, Tingwu Wang, Mengye Ren
Feedback Alignment Algorithms Lisa Zhang, Tingwu Wang, Mengye Ren Agenda Review of Back Propagation Random feedback weights support learning in deep neural networks Direct Feedback Alignment Provides Learning
More informationFigure (5) Kohonen Self-Organized Map
2- KOHONEN SELF-ORGANIZING MAPS (SOM) - The self-organizing neural networks assume a topological structure among the cluster units. - There are m cluster units, arranged in a one- or two-dimensional array;
More informationInternational Journal of Scientific Research & Engineering Trends Volume 4, Issue 6, Nov-Dec-2018, ISSN (Online): X
Analysis about Classification Techniques on Categorical Data in Data Mining Assistant Professor P. Meena Department of Computer Science Adhiyaman Arts and Science College for Women Uthangarai, Krishnagiri,
More informationReservoir Computing with Emphasis on Liquid State Machines
Reservoir Computing with Emphasis on Liquid State Machines Alex Klibisz University of Tennessee aklibisz@gmail.com November 28, 2016 Context and Motivation Traditional ANNs are useful for non-linear problems,
More informationMachine Learning : Clustering, Self-Organizing Maps
Machine Learning Clustering, Self-Organizing Maps 12/12/2013 Machine Learning : Clustering, Self-Organizing Maps Clustering The task: partition a set of objects into meaningful subsets (clusters). The
More informationARTIFICIAL NEURAL NETWORKS IN PATTERN RECOGNITION
ARTIFICIAL NEURAL NETWORKS IN PATTERN RECOGNITION Mohammadreza Yadollahi, Aleš Procházka Institute of Chemical Technology, Department of Computing and Control Engineering Abstract Pre-processing stages
More informationRecurrent Neural Network Models for improved (Pseudo) Random Number Generation in computer security applications
Recurrent Neural Network Models for improved (Pseudo) Random Number Generation in computer security applications D.A. Karras 1 and V. Zorkadis 2 1 University of Piraeus, Dept. of Business Administration,
More informationNeural Network Learning. Today s Lecture. Continuation of Neural Networks. Artificial Neural Networks. Lecture 24: Learning 3. Victor R.
Lecture 24: Learning 3 Victor R. Lesser CMPSCI 683 Fall 2010 Today s Lecture Continuation of Neural Networks Artificial Neural Networks Compose of nodes/units connected by links Each link has a numeric
More informationUse of Artificial Neural Networks to Investigate the Surface Roughness in CNC Milling Machine
Use of Artificial Neural Networks to Investigate the Surface Roughness in CNC Milling Machine M. Vijay Kumar Reddy 1 1 Department of Mechanical Engineering, Annamacharya Institute of Technology and Sciences,
More informationA motion planning method for mobile robot considering rotational motion in area coverage task
Asia Pacific Conference on Robot IoT System Development and Platform 018 (APRIS018) A motion planning method for mobile robot considering rotational motion in area coverage task Yano Taiki 1,a) Takase
More informationChapter 5 Components for Evolution of Modular Artificial Neural Networks
Chapter 5 Components for Evolution of Modular Artificial Neural Networks 5.1 Introduction In this chapter, the methods and components used for modular evolution of Artificial Neural Networks (ANNs) are
More informationMobile Robots: An Introduction.
Mobile Robots: An Introduction Amirkabir University of Technology Computer Engineering & Information Technology Department http://ce.aut.ac.ir/~shiry/lecture/robotics-2004/robotics04.html Introduction
More informationNeural Networks In Data Mining
Neural Networks In Mining Abstract-The application of neural networks in the data mining has become wider. Although neural networks may have complex structure, long training time, and uneasily understandable
More informationUNIVERSITY OF NORTH CAROLINA AT CHARLOTTE
UNIVERSITY OF NORTH CAROLINA AT CHARLOTTE Department of Electrical and Computer Engineering ECGR 4161/5196 Introduction to Robotics Experiment No. 5 A* Path Planning Overview: The purpose of this experiment
More informationObject Detection Lecture Introduction to deep learning (CNN) Idar Dyrdal
Object Detection Lecture 10.3 - Introduction to deep learning (CNN) Idar Dyrdal Deep Learning Labels Computational models composed of multiple processing layers (non-linear transformations) Used to learn
More informationBack propagation Algorithm:
Network Neural: A neural network is a class of computing system. They are created from very simple processing nodes formed into a network. They are inspired by the way that biological systems such as the
More informationLogical Rhythm - Class 3. August 27, 2018
Logical Rhythm - Class 3 August 27, 2018 In this Class Neural Networks (Intro To Deep Learning) Decision Trees Ensemble Methods(Random Forest) Hyperparameter Optimisation and Bias Variance Tradeoff Biological
More informationWhy Normalizing v? Why did we normalize v on the right side? Because we want the length of the left side to be the eigenvalue
Why Normalizing v? Why did we normalize v on the right side? Because we want the length of the left side to be the eigenvalue 1 CCI PCA Algorithm (1) 2 CCI PCA Algorithm (2) 3 PCA from the FERET Face Image
More informationArgha Roy* Dept. of CSE Netaji Subhash Engg. College West Bengal, India.
Volume 3, Issue 3, March 2013 ISSN: 2277 128X International Journal of Advanced Research in Computer Science and Software Engineering Research Paper Available online at: www.ijarcsse.com Training Artificial
More informationMathematics Curriculum
6 G R A D E Mathematics Curriculum GRADE 6 5 Table of Contents 1... 1 Topic A: Area of Triangles, Quadrilaterals, and Polygons (6.G.A.1)... 11 Lesson 1: The Area of Parallelograms Through Rectangle Facts...
More informationCharacter Recognition Using Convolutional Neural Networks
Character Recognition Using Convolutional Neural Networks David Bouchain Seminar Statistical Learning Theory University of Ulm, Germany Institute for Neural Information Processing Winter 2006/2007 Abstract
More informationCL7204-SOFT COMPUTING TECHNIQUES
VALLIAMMAI ENGINEERING COLLEGE 2015-2016(EVEN) [DOCUMENT TITLE] CL7204-SOFT COMPUTING TECHNIQUES UNIT I Prepared b Ms. Z. Jenifer A. P(O.G) QUESTION BANK INTRODUCTION AND NEURAL NETWORKS 1. What is soft
More informationMATLAB representation of neural network Outline Neural network with single-layer of neurons. Neural network with multiple-layer of neurons.
MATLAB representation of neural network Outline Neural network with single-layer of neurons. Neural network with multiple-layer of neurons. Introduction: Neural Network topologies (Typical Architectures)
More informationModule 1 Lecture Notes 2. Optimization Problem and Model Formulation
Optimization Methods: Introduction and Basic concepts 1 Module 1 Lecture Notes 2 Optimization Problem and Model Formulation Introduction In the previous lecture we studied the evolution of optimization
More informationLinear Separability. Linear Separability. Capabilities of Threshold Neurons. Capabilities of Threshold Neurons. Capabilities of Threshold Neurons
Linear Separability Input space in the two-dimensional case (n = ): - - - - - - w =, w =, = - - - - - - w = -, w =, = - - - - - - w = -, w =, = Linear Separability So by varying the weights and the threshold,
More informationFingerprint Identification System Based On Neural Network
Fingerprint Identification System Based On Neural Network Mr. Lokhande S.K., Prof. Mrs. Dhongde V.S. ME (VLSI & Embedded Systems), Vishwabharati Academy s College of Engineering, Ahmednagar (MS), India
More informationDeep Learning. Architecture Design for. Sargur N. Srihari
Architecture Design for Deep Learning Sargur N. srihari@cedar.buffalo.edu 1 Topics Overview 1. Example: Learning XOR 2. Gradient-Based Learning 3. Hidden Units 4. Architecture Design 5. Backpropagation
More informationSEMANTIC COMPUTING. Lecture 8: Introduction to Deep Learning. TU Dresden, 7 December Dagmar Gromann International Center For Computational Logic
SEMANTIC COMPUTING Lecture 8: Introduction to Deep Learning Dagmar Gromann International Center For Computational Logic TU Dresden, 7 December 2018 Overview Introduction Deep Learning General Neural Networks
More informationPattern Classification Algorithms for Face Recognition
Chapter 7 Pattern Classification Algorithms for Face Recognition 7.1 Introduction The best pattern recognizers in most instances are human beings. Yet we do not completely understand how the brain recognize
More informationCOMPUTER SIMULATION OF COMPLEX SYSTEMS USING AUTOMATA NETWORKS K. Ming Leung
POLYTECHNIC UNIVERSITY Department of Computer and Information Science COMPUTER SIMULATION OF COMPLEX SYSTEMS USING AUTOMATA NETWORKS K. Ming Leung Abstract: Computer simulation of the dynamics of complex
More informationLecture 20: Neural Networks for NLP. Zubin Pahuja
Lecture 20: Neural Networks for NLP Zubin Pahuja zpahuja2@illinois.edu courses.engr.illinois.edu/cs447 CS447: Natural Language Processing 1 Today s Lecture Feed-forward neural networks as classifiers simple
More informationA Comparative study of Clustering Algorithms using MapReduce in Hadoop
A Comparative study of Clustering Algorithms using MapReduce in Hadoop Dweepna Garg 1, Khushboo Trivedi 2, B.B.Panchal 3 1 Department of Computer Science and Engineering, Parul Institute of Engineering
More informationOptimization Methods for Machine Learning (OMML)
Optimization Methods for Machine Learning (OMML) 2nd lecture Prof. L. Palagi References: 1. Bishop Pattern Recognition and Machine Learning, Springer, 2006 (Chap 1) 2. V. Cherlassky, F. Mulier - Learning
More informationInstructor: Jessica Wu Harvey Mudd College
The Perceptron Instructor: Jessica Wu Harvey Mudd College The instructor gratefully acknowledges Andrew Ng (Stanford), Eric Eaton (UPenn), David Kauchak (Pomona), and the many others who made their course
More informationCHAPTER 6 IMPLEMENTATION OF RADIAL BASIS FUNCTION NEURAL NETWORK FOR STEGANALYSIS
95 CHAPTER 6 IMPLEMENTATION OF RADIAL BASIS FUNCTION NEURAL NETWORK FOR STEGANALYSIS 6.1 INTRODUCTION The concept of distance measure is used to associate the input and output pattern values. RBFs use
More information