CLASS NOTES Models, Algorithms and Data: Introduction to computing 2018


Petros Koumoutsakos, Jens Honore Walther (Last update: March 5, 2018)

IMPORTANT DISCLAIMERS

1. REFERENCES: Much of the material (ideas, definitions, concepts, examples, etc.) in these notes is taken (in some cases verbatim) for teaching purposes from several references, and in particular the references listed below. A large part of the lecture notes is adapted from The Nature of Mathematical Modeling by Neil Gershenfeld (Cambridge University Press, 1st ed., 1998). Therefore, these notes are only informally distributed and intended ONLY as a study aid for the final exam of ETHZ students that were registered for the course Models, Algorithms and Data (MAD): Introduction to computing 2018.

2. CONTENT: The present notes are a first LaTeX draft for the lectures of Models, Algorithms and Data (MAD): Introduction to computing. The notes have been checked, but there is no guarantee that they are free of mistakes. Please use with care. Again, these notes are only intended for use by the ETHZ students that were registered for the course Models, Algorithms and Data (MAD): Introduction to computing 2018, and ONLY as a study aid for their final exam.

Chapter 2 Neural Networks

2.1 Introduction

The general setting of learning is sketched in Figure 2.1: we have training data {x_1, y_1}, ..., {x_n, y_n}, we make a hypothesis for a function f: x -> y, and a learning algorithm (examples so far: LSQ, steepest descent, Newton's method) produces the final hypothesis.

Figure 2.1: Data, hypothesis, and learning algorithms.

Example: The Netflix problem. How do we suggest movies to a Netflix customer? We can collect data about the customer and obtain a profile of the user (see Figure 2.2). Then we can evaluate the similarities between all movies in the database and the user's profile. The N recommended movies are the N most similar ones.

Figure 2.2: Profile of the customer and a given movie in the database (e.g., scores for attributes such as action, comedy, blockbuster, Tom Cruise).
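As a small illustration of this ranking idea, the following sketch represents the user profile and every movie in the database as vectors of scores over the same attributes and recommends the N movies with the largest cosine similarity to the profile. The attribute set, the numerical values and the choice of cosine similarity are assumptions made for illustration only, not part of the lecture material.

import numpy as np

# Hypothetical profiles: scores for the attributes
# (action, comedy, blockbuster, Tom Cruise); values are illustrative only.
user_profile = np.array([0.9, 0.2, 0.7, 0.8])
movies = {
    "Movie A": np.array([0.8, 0.1, 0.9, 0.9]),
    "Movie B": np.array([0.1, 0.9, 0.3, 0.0]),
    "Movie C": np.array([0.7, 0.3, 0.6, 0.7]),
    "Movie D": np.array([0.2, 0.8, 0.5, 0.1]),
}

def similarity(a, b):
    # Cosine similarity between two profile vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Rank all movies by similarity to the user profile and keep the N most similar.
N = 2
ranked = sorted(movies, key=lambda title: similarity(user_profile, movies[title]),
                reverse=True)
print("recommended:", ranked[:N])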

2.2 Learning & functions/architectures

Figure 2.3 shows the error as a function of model complexity, i.e., the number of parameters. For the training data, the error decreases with increasing complexity. However, for the testing data (that was not used for the training), the error is a convex function of the complexity. The minimum of this function separates the regions of under- and over-fitting the data. The problem of overfitting can be checked with methods such as cross-validation (we split the data into fitting and testing data) or bootstrapping.

Figure 2.3: Error as a function of model complexity for training and testing data. The out-of-sample (testing) error attains a minimum at the optimal model complexity, which separates the under-fitting from the over-fitting region.
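The following minimal sketch illustrates the splitting idea on a toy polynomial-fitting problem; the synthetic data, the polynomial model and all numbers are assumptions for illustration. Polynomials of increasing degree play the role of models of increasing complexity: the training error keeps decreasing with the degree, while the error on the held-out split reveals where additional complexity stops helping.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: noisy samples of a smooth underlying function.
x = np.linspace(-1.0, 1.0, 60)
y = np.sin(2.0 * np.pi * x) + 0.2 * rng.standard_normal(x.size)

# Split the data into fitting (training) and testing data.
idx = rng.permutation(x.size)
train, test = idx[:15], idx[15:]

for degree in (1, 3, 5, 9, 13):        # model complexity = number of parameters
    coeffs = np.polyfit(x[train], y[train], degree)     # least-squares fit
    e_train = np.mean((np.polyval(coeffs, x[train]) - y[train]) ** 2)
    e_test = np.mean((np.polyval(coeffs, x[test]) - y[test]) ** 2)
    print(f"degree {degree:2d}: training error {e_train:.3f}, testing error {e_test:.3f}")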

2.3 Neural Network architecture

There are many Neural Network (NN) types, e.g., feed-forward NN, convolutional NN, recurrent NN. Here, we will consider the simplest NN architecture, i.e., the fully connected feed-forward NN shown in Figure 2.4. The leftmost layer in this network is called the input layer, and the nodes/neurons within this layer are called input nodes/neurons. The rightmost or output layer contains the output nodes (in Figure 2.4 a single output node). The middle layer is called a hidden layer.

Figure 2.4: Neuron (left) and neural network (right) with an input layer with I input nodes, 1 hidden layer with J nodes, and an output layer with a single output node (K = 1).

Consider a node j with I inputs x_i, i = 1, ..., I. The output x_j of the node is given by

    x_j = g(h_j),    h_j = \sum_{i=1}^{I} w_{ji} x_i,    (2.3.1)

where g is the activation function and w_{ji} are the weights. Some of the commonly used activation functions are:

- sigmoid: g(h_j) = \frac{1}{1 + e^{-h_j}}
- hyperbolic tangent: g(h_j) = tanh(h_j)
- rectified linear units (ReLU): g(h_j) = max(0, h_j)
- softmax or normalized exponential function: squashes a K-dimensional vector z of arbitrary real values to a K-dimensional vector σ(z) of real values in the range [0, 1] that add up to 1, i.e., σ(z)_j = e^{z_j} / \sum_{k=1}^{K} e^{z_k} for j = 1, ..., K.

Now consider the neural network in Figure 2.4, where the input x = (x_1, x_2, ..., x_I) is a vector of size I, the hidden layer has J nodes, and the output is a vector Y^{NN} = (Y_1^{NN}, Y_2^{NN}, ..., Y_K^{NN}) of size K, with K = 1 in our case. For the nodes in the hidden layer, we can write

    h_j = \sum_{i=1}^{I} w_{ji} x_i,    x_j = g(h_j) = g(\sum_{i=1}^{I} w_{ji} x_i).    (2.3.2)

The value of the output node is

    H_k = \sum_{j=1}^{J} W_{kj} x_j = \sum_{j=1}^{J} W_{kj} g(\sum_{i=1}^{I} w_{ji} x_i),    Y_k^{NN} = G(H_k),    (2.3.3)

where G is the activation function of the output layer.
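As an illustration, a minimal NumPy sketch of the forward pass (2.3.2)-(2.3.3) for the network of Figure 2.4 is given below, assuming a sigmoid activation g in the hidden layer and the identity as the output activation G; the weights are random placeholders rather than trained values.

import numpy as np

def g(h):
    # Sigmoid activation g(h) = 1 / (1 + exp(-h)).
    return 1.0 / (1.0 + np.exp(-h))

def forward(x, w, W):
    # Forward pass of the fully connected feed-forward NN of Figure 2.4.
    #   x: input vector of size I
    #   w: hidden-layer weights, shape (J, I), entries w_ji
    #   W: output-layer weights, shape (K, J), entries W_kj
    h = w @ x              # h_j = sum_i w_ji x_i              (2.3.2)
    x_hidden = g(h)        # x_j = g(h_j)
    H = W @ x_hidden       # H_k = sum_j W_kj x_j              (2.3.3)
    Y = H                  # Y_k^NN = G(H_k), with G = identity here
    return x_hidden, Y

# Example with I = 3 inputs, J = 4 hidden nodes and K = 1 output node.
rng = np.random.default_rng(1)
I_dim, J_dim, K_dim = 3, 4, 1
w = rng.standard_normal((J_dim, I_dim))
W = rng.standard_normal((K_dim, J_dim))
x = np.array([0.5, -1.0, 2.0])
x_hidden, Y = forward(x, w, W)
print("hidden activations:", x_hidden)
print("network output Y^NN:", Y)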

We would like to train the NN on data {x_l, y_l}, where l = 1, 2, ..., L and L is the size of the training data. The cost function E is given by

    E = \frac{1}{2} \sum_{l=1}^{L} (y_l - Y_l^{NN})^2,

    E = \frac{1}{2} \sum_{l=1}^{L} ( y_l - G( \sum_{j=1}^{J} W_{kj} \, g( \sum_{i=1}^{I} w_{ji} x_i ) ) )^2.    (2.3.4)

The cost function can be minimized with the back-propagation method, i.e., by computing the derivative of E with respect to the weights w, W, which are the unknown parameters of the NN. Based on the derivatives, we can update the weights using an optimizer. There are many optimizers, e.g., gradient descent, stochastic gradient descent, Adam, RMSProp.
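To make the back-propagation step concrete, a minimal sketch is given below (an illustrative implementation, not the course's reference code). It trains the one-hidden-layer network of Figure 2.4 on synthetic data with plain full-batch gradient descent, assuming a sigmoid hidden activation g, an identity output activation G and K = 1; the derivatives of (2.3.4) then follow from the chain rule, using g'(h) = g(h)(1 - g(h)). The toy data, network size and step size are assumptions for illustration.

import numpy as np

def g(h):
    # Sigmoid activation; its derivative is g'(h) = g(h) * (1 - g(h)).
    return 1.0 / (1.0 + np.exp(-h))

rng = np.random.default_rng(2)

# Synthetic training data {x_l, y_l}, l = 1..L, with I = 2 inputs (illustrative only).
L_data, I_dim, J_dim = 200, 2, 8
X = rng.uniform(-1.0, 1.0, size=(L_data, I_dim))
y = np.sin(np.pi * X[:, 0]) * X[:, 1]

# Random initial weights: hidden layer w (J x I) and output layer W (1 x J).
w = 0.5 * rng.standard_normal((J_dim, I_dim))
W = 0.5 * rng.standard_normal((1, J_dim))
eta = 0.05                                   # gradient-descent step size

for step in range(2001):
    # Forward pass, eqs. (2.3.2)-(2.3.3), identity output activation G.
    h = X @ w.T                              # h_{l,j} = sum_i w_ji x_{l,i}
    x_hidden = g(h)                          # x_{l,j} = g(h_{l,j})
    Y = (x_hidden @ W.T).ravel()             # Y_l^NN = sum_j W_kj x_{l,j}

    # Cost (2.3.4) and its derivatives w.r.t. the weights (back-propagation).
    err = Y - y                              # dE/dY_l^NN
    E = 0.5 * np.sum(err ** 2)
    dW = (err[:, None] * x_hidden).sum(axis=0, keepdims=True)   # dE/dW_kj
    delta = err[:, None] * W * x_hidden * (1.0 - x_hidden)      # error back-propagated to hidden layer
    dw = delta.T @ X                                            # dE/dw_ji

    # Update the weights with (mean) gradient descent.
    W -= eta * dW / L_data
    w -= eta * dw / L_data

    if step % 500 == 0:
        print(f"step {step:4d}: cost E = {E:.4f}")

The same computed derivatives could instead be handed to one of the optimizers mentioned above (stochastic gradient descent, Adam, RMSProp), which only changes how the derivatives are turned into weight updates.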
