Backpropagation Algorithm:

Neural network:

A neural network is a class of computing system. Neural networks are built from very simple processing nodes formed into a network, and they are inspired by the way that biological systems such as the brain work. They are fundamentally pattern recognition systems and tend to be most useful for tasks that can be described in terms of pattern recognition. They are 'trained' by feeding them datasets (images, raw data) with known outputs. As an example, imagine that you are trying to train a network to output a 1 when it is given a picture of a cat and a 0 when it sees a picture that is not a cat. You would train the network by running lots of pictures of cats through it and using an algorithm to tweak the network parameters until it gave the correct response. The parameters are usually a gain on each input and a weight on each node, as well as the actual structure of the network (how many nodes, in how many layers, with what interconnections). Recognizing cat pictures is actually a quite complex problem and would require a complex neural network (possibly starting with one node per pixel). A usual starting point for experimenting with neural networks is to implement simple logic gates, such as AND, OR, and NOT, as neural nets (see the sketch below).

Neural networks can be a very fast way of achieving a complex result. They are very interesting for AI research because they are a model for the animal brain. One of their major disadvantages is that they are very hard to reverse engineer: if your network decides that one particular image of an elephant is actually a cat, you cannot really determine 'why' in any useful sense. All you can really do is keep training or tweaking the network. Neural networks have traditionally been used for well-bounded tasks such as coin/note recognition in vending machines, or defect spotting on production lines, but nowadays NNs are widely used in combination with computer vision applications to recognize a wide variety of visual data [1].
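Following the suggestion above, here is a minimal sketch of a single threshold neuron implementing the AND gate, in Python. The weights and bias are hand-picked for illustration rather than learned, and the helper names are my own.

def neuron(inputs, weights, bias):
    """A single threshold neuron: fire (return 1) if the weighted
    sum of the inputs exceeds the bias, otherwise stay silent (0)."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total > bias else 0

def AND(a, b):
    # Hand-picked parameters: the weighted sum exceeds 1.5
    # only when both inputs are 1.
    return neuron([a, b], weights=[1.0, 1.0], bias=1.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", AND(a, b))   # prints 1 only for (1, 1)

A trained network would arrive at equivalent weights by the tweaking procedure described above instead of having them chosen by hand.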

Computer vision and NN:

Computer vision has become ubiquitous in our society, with applications in search, image understanding, apps, mapping, medicine, drones, and self-driving cars. Core to many of these applications are visual recognition tasks such as image classification, localization, and detection. Recent developments in neural network (a.k.a. deep learning) approaches have greatly advanced the performance of these state-of-the-art visual recognition systems. For example, suppose we are looking at a can of Coke [2].

The human sense of vision is unbelievably advanced. Within fractions of a second, we can identify objects within our field of view, without thought or hesitation. Not only can we name the objects we are looking at, we can also perceive their depth, perfectly distinguish their contours, and separate the objects from their backgrounds. Somehow our eyes take in raw pixels of color data, but our brain transforms that information into more meaningful primitives: lines, curves, and shapes that might indicate the targeted objects. This is a natural integration of computer vision and neural networks [3].

Backpropagation algorithm:

Backpropagation, often abbreviated as Back-Prop, is one of the several ways in which an artificial neural network (ANN) can be trained. It is a supervised training scheme, which means it learns from labeled training data: there is a supervisor to guide its learning. To put it in simple terms, Back-Prop is like 'learning from mistakes'; the supervisor corrects the ANN whenever it makes mistakes.

An ANN consists of nodes in different layers: an input layer, intermediate hidden layer(s), and an output layer. The connections between nodes of adjacent layers have "weights" associated with them, and the goal of learning is to assign correct weights to these edges. Given an input vector, these weights determine the output vector. In supervised learning, the training set is labeled (e.g., labeled images), which means that for the given inputs we know, from the label, the desired/expected output.

Back-Prop algorithm:
1. Initially, all the edge weights are randomly assigned.
2. For every input in the training dataset, the ANN is activated and its output is observed.
3. This output is compared with the desired output that we already know, and the error is "propagated" back to the previous layer.
4. The error is noted and the weights are "adjusted" accordingly.
5. This process is repeated until the output error is below a predetermined threshold.

Once the algorithm terminates, we have a "learned" ANN which we consider ready to work with "new" inputs. This ANN is said to have learned from several examples (labeled data) and from its mistakes (error propagation) [4]. A minimal sketch of this loop appears below.
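To make the five steps above concrete, here is a minimal NumPy sketch of the Back-Prop loop on a toy labeled dataset (XOR). The network size, learning rate, and stopping threshold are illustrative assumptions, not the project's parameters.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy labeled training set: inputs X with desired outputs y (XOR).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4))   # step 1: edge weights randomly assigned
W2 = rng.normal(size=(4, 1))
lr = 0.5                       # learning rate (assumed)

for epoch in range(20000):
    # Step 2: activate the network and observe its output.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)

    # Step 3: compare with the desired output; propagate the error back.
    err = y - out
    d_out = err * out * (1 - out)          # output-layer delta
    d_h = (d_out @ W2.T) * h * (1 - h)     # hidden-layer delta

    # Step 4: adjust the weights accordingly.
    W2 += lr * h.T @ d_out
    W1 += lr * X.T @ d_h

    # Step 5: stop once the error falls below a predetermined threshold.
    if np.mean(err ** 2) < 1e-3:
        break

After training, the "learned" weights W1 and W2 can be applied to new inputs exactly as described above.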

Blueprint of ANN use in our project (link):

Using a neural network and self-learning:

Besides the advantages of NNs discussed above, one more advantage of using a neural network in our project is that once the network is trained, it only needs to load the trained parameters afterwards, so prediction can be very fast. Only the lower half of the input image is used for training and prediction.

Artificial neural networks are basically crude electronic models based on the structure of the neurons of a brain. ANNs do a promising job of recognizing patterns, processing images, and handling other dynamic tasks, with a much higher success rate. A multilayer perceptron (MLP) neural network is a feedforward artificial neural network model that maps a set of input data onto a set of appropriate outputs. An MLP is basically a directed graph consisting of multiple layers of nodes; every node other than the input nodes is a neuron with a nonlinear activation function. Backpropagation is the supervised learning technique used to train the network. An MLP is a modification of the standard linear perceptron and can distinguish data that is not linearly separable. For the proposed system, a three-layer MLP network is used: an input layer (with 38,400 inputs), a hidden layer (with 32 nodes), and an output layer (with 4 outputs). The number of nodes in the hidden layer is an arbitrary choice; the number of inputs is 320 * 120 = 38,400, the number of pixels in the lower half of each frame; and the number of outputs is 4, one for each direction of movement.

Prediction and steering:

Collection of training data is a step-by-step process. A set of progressive frames is captured and converted to a NumPy array. Each array is then paired with a label, which is basically the recorded human input, and all these image/label pairs are saved into a lightweight database. The neural network is trained in OpenCV using the backpropagation method. Once the training is completed, the trained weights for the nodes are stored in an XML file; to make predictions, the neural network is loaded with this XML file. A sketch of this train/save/load cycle follows.
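As a rough illustration of the cycle just described, the sketch below uses OpenCV's ANN_MLP with the 38,400-32-4 layer sizes from the text. The file names and the way the training data is loaded are assumptions, not the project's actual code.

import cv2
import numpy as np

# Assumed inputs: `samples` is an N x 38400 float32 array (lower half of
# each 320x240 frame, flattened) and `labels` is an N x 4 float32 array
# with a one-hot entry per driving direction. The .npy files are
# hypothetical stand-ins for the project's lightweight database.
samples = np.load('training_images.npy').astype(np.float32)
labels = np.load('training_labels.npy').astype(np.float32)

mlp = cv2.ml.ANN_MLP_create()
mlp.setLayerSizes(np.int32([38400, 32, 4]))     # input, hidden, output
mlp.setActivationFunction(cv2.ml.ANN_MLP_SIGMOID_SYM)
mlp.setTrainMethod(cv2.ml.ANN_MLP_BACKPROP)     # backpropagation training
mlp.setTermCriteria((cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT,
                     500, 1e-4))                # assumed stopping criteria

mlp.train(samples, cv2.ml.ROW_SAMPLE, labels)
mlp.save('mlp.xml')                             # weights stored as XML

# Later, for prediction, only the XML file needs to be loaded:
mlp2 = cv2.ml.ANN_MLP_load('mlp.xml')
_, prediction = mlp2.predict(samples[:1])
direction = np.argmax(prediction)               # index of the steering command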

References:
[1] Luke Graham, "What is a Neural Network in simple words?"; softwareengineering.stackexchange.com
[2] "Neural Networks for Visual Recognition"; cs231n.stanford.edu
[3] Nikhil Buduma (MIT), "Inside Deep Learning: Computer Vision With Convolutional Neural Networks".
[4] Hemanth Kumar Mantri, Software Engineer, Texas Ex; quora.com/how-do-you-explain-back-propagation-algorithm-to-a-beginner-in-neural-network

List of short forms used:
ANN: artificial neural network
CV: computer vision
SL: self-learning
AI: artificial intelligence

List of figures:
1. Natural recognition of object
2. Natural neural network working
3. Backpropagation algorithm working
4. Layered structure of NN
5. Labelling of frames