Problem Set 2: From Perceptrons to Back-Propagation

COMPUTER SCIENCE 397 (Spring Term 2005)
Neural Networks & Graphical Models
Prof. Levy
Due Friday 29 April

1 Reading Assignment: AIMA Ch.

2 Programming Assignment: More Matlab, More Perceptrons, and Back-Prop

This assignment consists of two parts: (1) enhancing your Matlab percep learning function from the last assignment, and (2) implementing back-propagation. Let's take these one at a time.

2.1 Enhancing the perceptron algorithm

2.1.1 Computing RMS error

Recall the error term D from the PERCEPTRON-LEARNING algorithm. This term expresses the difference between the target output T and the actual output O for each training pattern. Once this difference drops to zero (or close to zero), there is no need to continue training via the outer loop. Instead of looping some arbitrary number of times (like 1000), we can therefore stop looping when the error, averaged over all the patterns, drops below a certain value.

We don't really care whether the values of D are too big in the positive or the negative direction, just that their magnitude is too big (for example, D_j = 0.5 and D_j = -0.5 are equally bad). Furthermore, we want to compute the average difference over all the patterns, to get an idea of how well our perceptron is working on the whole problem. That is why we took the average of the absolute values of D in the previous assignment. A more common way of expressing the average error is the Root Mean Squared (RMS) error, defined as

    E = \sqrt{ \frac{1}{pm} \sum_{j=1}^{p} \sum_{k=1}^{m} D_{jk}^2 }

where m is the number of outputs and p is the number of patterns, as before. (Intuitively, squaring each D_{jk} gets rid of its +/- sign, and the square-root function undoes the effect of squaring.)
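For reference, the whole formula can be written as a single vectorized Matlab expression. This is just a sketch, assuming you have collected the differences for all p patterns into a p-by-m matrix D; inside percep you will instead accumulate the sum pattern by pattern, as described next:

    % D: p-by-m matrix of the differences T - O over all patterns
    E = sqrt(sum(D(:).^2) / (p*m));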

To compute the RMS error, set e(i) to zero right before the pattern (j) loop, and add to it the sum of the squares of D within the pattern loop:

    e(i) = e(i) + sum(D .* D);   % .* means element-wise product

Then, right after the pattern loop, divide this sum by the product p*m, and compute the square root of the resulting value using Matlab's built-in sqrt function. Once you have done this, you can change the outer for loop (for i = 1:1000) to an infinite loop (for i = 1:inf) and then halt this loop using a break statement right after you compute the RMS error E:

    if e(i) < 0.05, break, end

If you leave the final semicolon off the call to sqrt, you will be able to see the RMS error on each iteration of the learning algorithm. Note how few iterations it actually takes to learn the boolean OR and AND functions.

2.1.2 Scaling Up

In the real world, we would probably use a built-in AND or OR function (or a prefabricated circuit), instead of training a perceptron to learn such a simple task. Indeed, as its name suggests, the perceptron was designed to make decisions about percepts (images, sounds), just as human beings are able to do so effortlessly. For this part of the assignment, you will train your perceptron to classify bitmaps (images) of the digits zero through seven.

First, download the Matlab file digits.mat from the class web page. This file contains the matrices dig0, dig1, ..., dig7, containing bitmaps of the digits 0 through 7, as well as the vectors bin0, bin1, ..., bin7, containing bit patterns for those numbers (0 = 000, 1 = 001, ..., 7 = 111). To see the bitmaps more clearly, you can use Matlab's spy function; e.g., spy(dig3) will show you something that looks (roughly) like the digit 3.

We need to put these patterns into one matrix containing all the digits and another containing all the binary targets. For the latter, we can just use Matlab's semicolon notation, as we did in the previous lab:

    >> bins = [bin0; bin1; bin2; bin3; bin4; bin5; bin6; bin7]

For the bitmap images, we will have to turn each image into a vector, and then combine the vectors as above. We can do this using Matlab's reshape function, which turns each 5-by-4 matrix into a 1-by-20 vector (you might want to write a separate dig2mat function to replace the calls to reshape(dign, 1, 20)):

    >> digs = [reshape(dig0,1,20); reshape(dig1,1,20); ...; reshape(dig7,1,20)]

Now we can train our perceptron by simply typing w = percep(digs, bins), and test it by typing, e.g., ptest(w, reshape(dig0, 1, 20)).

Finally, let's play a little with the graceful-degradation feature of neural-net models. Add a little noise to one of the digits by using the noise function, which you can download from the web page. This function adds noise, from 0 to 100 percent, to the bitmap:

    >> ptest(w, reshape(noise(dig3, 10), 1, 20))

How much noise is fatal to the perceptron's behavior? Test a few different digits with different amounts of noise, and turn in a README with your results.
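Putting the pieces of Section 2.1.1 together, the enhanced training loop might look roughly like the sketch below. It assumes the threshold-unit percep and delta-rule update from the previous assignment, and an assumed learning rate of 0.1; it also uses a while loop with an explicit counter, which behaves like the for i = 1:inf loop described above. Your own variable names and details may differ:

    function w = percep(I, T)
    % PERCEP  Sketch of the enhanced perceptron learner with RMS stopping.
    [p, n] = size(I);            % p patterns, n inputs
    m = size(T, 2);              % m outputs
    I(:,end+1) = 1;              % append the bias input
    w = randn(n+1, m);           % random initial weights (incl. bias row)
    alpha = 0.1;                 % learning rate (assumed value)
    i = 0;
    while true                   % loop until the RMS error is small
        i = i + 1;
        e(i) = 0;
        for j = 1:p
            O = I(j,:) * w > 0;              % threshold activation
            D = T(j,:) - O;                  % difference term
            w = w + alpha * I(j,:)' * D;     % delta rule
            e(i) = e(i) + sum(D .* D);       % accumulate squared error
        end
        e(i) = sqrt(e(i) / (p*m))            % no semicolon: show RMS error
        if e(i) < 0.05, break, end
    end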

2.2 Back-Propagation

2.2.1 Implementing Back-Prop

In this part of the lab we will use the formulas from the back-prop lecture to convert our percep program into a program for doing back-propagation. To get started, copy your enhanced percep.m to a new file bp.m. Then download the file bp.zip from the class web page, and unzip it to get a bunch of small utility functions that I wrote to help you.

Now you can start modifying bp.m to do back-prop. First, change the function header to reflect the new name, the two sets of weights returned (input-to-hidden and hidden-to-output), and the two additional input arguments (number of hidden units, learning rate) required by the back-prop algorithm:

    function [wih, who] = bp(I, T, nh, eta)

Next, change the names of the variables encoding the number of input and output units to something more meaningful, and use these variables to create the initial random weight matrices (including bias weights). Be sure to change these variables everywhere they appear, by using global search-and-replace:

    ni = size(I, 2);
    no = size(T, 2);
    wih = randn(ni+1, nh);
    who = randn(nh+1, no);

You should remove the line I(:,end+1) = 1;, because the attachment of biases is already done in the forward and update utility functions. The remainder of the program can be written using the comments below as a guideline. Each comment corresponds to a call (or two) of one of the utility routines in bp.zip. Looking at the routines and the class notes on back-prop should help you figure out how to call the routines.

You should also write a test program bptest.m, which will accept the weights (wih and who) computed by bp, as well as the input I, and will compute and show the resulting outputs (by leaving out a final semicolon from the line in which they are computed). Train and test your network on the XOR function (b = [0; 1; 1; 0]), and experiment with a few different learning rates and numbers of hidden units. Turn in your bp.m and bptest.m, and some plots of your errors over training epochs, as you did in the previous section. You can use Matlab's title command to annotate each plot with these values; e.g., title('NH=2 Eta=0.3').
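For bptest, a minimal sketch might perform the forward pass inline, assuming the sigmoid (logistic) activation from the back-prop lecture and the weight-matrix shapes created above; the forward utility in bp.zip does the same job, so check its actual signature if you prefer to call it instead:

    function O = bptest(wih, who, I)
    % BPTEST  Sketch: show the network outputs for the input patterns in I.
    p = size(I, 1);
    H = 1 ./ (1 + exp(-[I, ones(p,1)] * wih));   % hidden activations
    O = 1 ./ (1 + exp(-[H, ones(p,1)] * who))    % no semicolon: show outputs

For the XOR problem, for example, you would call it as bptest(wih, who, [0 0; 0 1; 1 0; 1 1]).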

    % for each pattern 1 <= j <= p do
    for j = 1:p
        % forward pass
        % error on output
        % delta on output
        % backprop to get error on hidden
        % delta on hidden
        % delta rule for weight update
    end

    % update weights

    % report RMS error every 1000 iterations
    if mod(i,1000) == 0
        fprintf('%d: %4.4f\n', i, E)
    end

2.2.2 Speedup and Generalization

Once you have successfully implemented back-prop, try speeding it up by adding a momentum term, as described in the lecture notes. The momentum coefficient µ (mu) should be passed in as the fifth argument to your bp function. Run your modified back-prop algorithm a few times on the XOR problem, using µ = 0.9, to convince yourself that momentum can dramatically speed up convergence. Note down the approximate average difference in training time in your README file.
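The momentum update simply blends each new weight change with the previous one: Δw(t) = η · (delta-rule change) + µ · Δw(t−1). A sketch of how this might look in bp.m, where dwih and dwho are initialized to zeros of the same sizes as wih and who before the training loop, and gih and gho stand for whatever delta-rule changes your code already computes (both names are placeholders, not bp.zip routines):

    dwih = eta * gih + mu * dwih;   % new input->hidden change plus momentum
    dwho = eta * gho + mu * dwho;   % new hidden->output change plus momentum
    wih  = wih + dwih;              % apply the updates
    who  = who + dwho;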

Next, we'll experiment a bit with the tradeoff between learning speed and generalization. Download the file alpha.mat from the class web page, and load the data contained in it by typing

    >> load alpha

in the Matlab interpreter. The data consist of bitmaps for the letters A and B, as well as training matrices containing these bitmaps, and binary codes for each bitmap (A = 01; B = 10). (You can see the bitmap patterns using spy(A) and spy(B).)

Perform the following simple experiment: Train a hidden-layer net on this data using your bp program, with a small number of hidden units (e.g., two). Test this small network on the training data with 25 percent noise added, as you did for the digit-recognizing perceptron above. Treat outputs above 0.5 as 1 and outputs below 0.5 as 0 (you can have Matlab do this for you: bptest(wih, who, noise(alpha, 25)) > 0.5), and record the number of times, out of ten tests, that the network correctly classifies both letters. Now train a much larger network (e.g., 40 hidden units) on the same task, and test it the same way. How do your two networks compare on this version of generalization? Put your results in your README file, along with a brief explanation of any difference you note.
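One way to automate the ten noisy tests is sketched below; here targets stands for whatever variable in alpha.mat holds the binary codes and alpha for the training-input matrix (substitute the actual names, and check the argument order of your noise function):

    correct = 0;
    for t = 1:10
        O = bptest(wih, who, noise(alpha, 25)) > 0.5;  % threshold the outputs
        if isequal(O, targets)
            correct = correct + 1;   % both letters classified correctly
        end
    end
    correct                          % no semicolon: show the count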
