NEURAL NETWORKS. Typeset by FoilTEX 1


Basic Concepts

- The McCulloch-Pitts model
- Hebb's rule
- Neural network: double dynamics; pattern formation and pattern recognition
- Neural network as an input-output device
- What is the Perceptron and how does it learn?
- What a Perceptron can and cannot do

The McCulloch-Pitts model

- a set of weighted inputs
- an adder
- an activation function (here a threshold-dependent step function). Can you imagine anything else?

Limitations of the MCP neuron (features of real neurons it leaves out):

- nonlinear summation
- spike trains
- asynchronous update
- distinct excitatory and inhibitory synapses
- feedback loops
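As a concrete sketch, an MCP unit is just a weighted sum fed through a step function. The AND-gate weights and threshold below are illustrative choices, not values from the slides:

```python
def mcp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: weighted sum of inputs through a threshold step function."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# With weights (1, 1) and threshold 2 the unit computes logical AND.
print(mcp_neuron((1, 1), (1, 1), 2))  # fires: 1
print(mcp_neuron((1, 0), (1, 1), 2))  # stays silent: 0
```

Note how every limitation listed above is visible here: summation is linear, the output is a single binary value rather than a spike train, and there is no feedback.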

Hebb's rule

d/dt w_ij(t) = F(a_i, a_j)    (1)

where F is a functional, and a_j and a_i are presynaptic and postsynaptic activity functions (i.e., they may include activity levels over some period of time, not just the current activity values). To define specific learning rules, i.e., the form of F, a few points must be clarified. The simplest Hebbian learning rule can be formalized as

d/dt w_ij(t) = k a_i(t) a_j(t),  k > 0    (2)

This rule expresses the conjunction between pre- and postsynaptic elements (using neurobiological terminology), or associative conditioning (in psychological terms), by a simple product of the actual states of the pre- and postsynaptic elements, a_j(t) and a_i(t).
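A minimal discrete-time sketch of Eq. (2) using an Euler step; the rate k and step size below are arbitrary illustrative values, not from the slides:

```python
def hebb_step(w, a_pre, a_post, k=0.5, dt=1.0):
    """One Euler step of the simplest Hebb rule, dw_ij/dt = k * a_i(t) * a_j(t)."""
    return w + dt * k * a_post * a_pre

# Coincident pre- and postsynaptic activity strengthens the weight on every step.
w = 0.0
for _ in range(10):
    w = hebb_step(w, a_pre=1.0, a_post=1.0)
print(w)  # 5.0

# With a silent presynaptic neuron, the weight does not change.
print(hebb_step(1.0, a_pre=0.0, a_post=1.0))  # 1.0
```

The run also makes a known property of the plain rule visible: with no decay term, repeated coincident activity grows the weight without bound.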

Neural network: double dynamics. Pattern Formation and Pattern Recognition

[Figure 1: Activation function]

Neural network: double dynamics. Pattern Formation and Pattern Recognition

[Figure 2: Double dynamics. The network couples "node dynamics" (activations evolving from input to output) with "edge dynamics" (the weights W evolving on the connections).]

Neural network: double dynamics. Pattern Formation and Pattern Recognition

SELF-ORGANIZATION: Development and Plasticity of Ordered Structures

- How to generate topographic order (e.g., the retina-to-optic-tectum projection with connection strengths s_ij)?
- How to preserve or restore it after partial lesions?
- ONTOGENY AND(!) PLASTICITY: projections from the LGN to cortex (layer IV) form ocular dominance columns (L/R) starting from an initial configuration that is homogeneous, with very small weights.

[Figure 3: Spatial pattern formation]

Neural network as an input-output device

- Input vector: x
- Weights: W
- Output: y; y = y(x, W)
- Targets: t
- Activation function: g(.)
- Error: E, a measure of the deviation between y and t

STRUCTURE. What is the Perceptron and how does it learn?

- Possible mistakes a neuron can make: the error is t_k - y_k.
- Perceptron learning rule (the numbers of inputs and of neurons are not necessarily the same): w_ik <- w_ik + eta (t_k - y_k) x_i
- If x_i = 0, the rule cannot change w_ik, so only the threshold remains adjustable; hence introduce a bias node with fixed input x_0 = 1 whose weight w_0j is adjustable.

What is the Perceptron and how does it learn? The Perceptron learning algorithm

- Initialization: set the weights and threshold to small (positive and negative) random numbers.
- Training: for each input vector, calculate the output, then update the weights according to the Perceptron learning rule: w_ik <- w_ik + eta (t_k - y_k) x_i
- Recall: compute the activation of each neuron in the same way, without updating the weights.

What a Perceptron can and cannot do? The logical OR operation: NN implementation

IMPLEMENTATION: choose from the scripts of Chapter 3: http://seat.massey.ac.nz/personal/s.r.marsland/mlbook.html
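The algorithm above can also be sketched in a few lines. The learning rate, epoch count, and zero initialization here are illustrative assumptions (the slides suggest small random initial weights); OR is the linearly separable case, so the rule converges:

```python
def train_perceptron(X, T, eta=0.25, epochs=20):
    """Perceptron training: w_ik <- w_ik + eta * (t_k - y_k) * x_i.
    Zero initialization and these hyperparameters are illustrative choices."""
    w = [0.0] * (len(X[0]) + 1)          # one weight per input, plus the bias weight
    for _ in range(epochs):
        for x, t in zip(X, T):
            xb = [1] + list(x)           # prepend the bias input x_0 = 1
            y = 1 if sum(wi * xi for wi, xi in zip(w, xb)) >= 0 else 0
            w = [wi + eta * (t - y) * xi for wi, xi in zip(w, xb)]
    return w

def recall(w, x):
    """Recall: compute the activation with the learned weights, no updates."""
    xb = [1] + list(x)
    return 1 if sum(wi * xi for wi, xi in zip(w, xb)) >= 0 else 0

# Logical OR is linearly separable, so the learning rule converges.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
T = [0, 1, 1, 1]
w = train_perceptron(X, T)
print([recall(w, x) for x in X])  # [0, 1, 1, 1]
```

This mirrors the three phases of the algorithm: initialization, training with the update rule, and recall with frozen weights.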

Linear Separability. What a Perceptron can and cannot do?

The Perceptron tries to find a decision boundary or discriminant function: a straight line in 2D, a plane in 3D, and a hyperplane in higher dimensions. More specifically, it separates two groups of inputs, those for which the neuron (i) should fire and (ii) should not fire. The boundary is the set of points satisfying x w^T = 0; the weight vector w is normal to this boundary.
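Classifying a point then amounts to checking which side of the hyperplane x w^T = 0 it falls on; the 2-D weight vector below is a made-up example, not from the slides:

```python
def side(w, x):
    """Sign of the dot product w . x: +1 or -1 for the two sides, 0 on the boundary."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    return (s > 0) - (s < 0)

# In 2-D with w = (1, -1), the decision boundary is the line x1 = x2.
print(side((1, -1), (2.0, 1.0)))  # 1  (the "should fire" side)
print(side((1, -1), (1.0, 2.0)))  # -1 (the "should not fire" side)
print(side((1, -1), (3.0, 3.0)))  # 0  (exactly on the boundary)
```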

Linear Separability. What a Perceptron can and cannot do?

Given (data, target) pairs, the Perceptron tries to find a separating straight line; cases where such a line exists are linearly separable. With more than one output neuron, several straight lines are found, one per neuron.

Exclusive Or (XOR). What a Perceptron can and cannot do?

XOR is not linearly separable, so the Perceptron algorithm does not converge on it. Minsky and Papert's critique of this limitation contributed to roughly 15 years of withdrawn funding for neural network research.

Changing dimension! What a Perceptron can and cannot do?

Adding a dimension does not change the data when it is looked at in (x, y); it just moves the point (0, 0) along a third dimension. General message: linear separability is always possible after projecting the data into the correct set of dimensions. This idea leads to kernel classifiers and support vector machines.
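A sketch of this idea on XOR: using the product x1*x2 as the third coordinate is one assumed choice of projection (the slide does not specify one), and in the lifted space a single plane separates the two classes:

```python
def lift(x1, x2):
    """Project (x1, x2) into 3-D by adding the product feature x1*x2."""
    return (x1, x2, x1 * x2)

# XOR targets: not separable by any straight line in the (x1, x2) plane.
xor = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

# In 3-D, the plane z1 + z2 - 2*z3 = 0.5 separates the two classes.
preds = {}
for (x1, x2), _ in xor.items():
    z1, z2, z3 = lift(x1, x2)
    preds[(x1, x2)] = 1 if z1 + z2 - 2 * z3 >= 0.5 else 0
print(preds == xor)  # True
```

A kernel classifier applies the same trick implicitly, without ever computing the lifted coordinates explicitly.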