INTELLIGENT PROCESS SELECTION FOR NTM - A NEURAL NETWORK APPROACH
International Journal of Industrial Engineering Research and Development (IJIERD), ISSN (Print), ISSN (Online), Volume 1, Number 1, July - Aug (2010), © IAEME

INTELLIGENT PROCESS SELECTION FOR NTM - A NEURAL NETWORK APPROACH

V. Sugumaran, Department of Mechatronics Engineering, SRM University, Kancheepuram, v_sugu@yahoo.com
V. Muralidharan, Department of Mechatronics Engineering, SRM University, Kancheepuram
Bharath Kumar Hegde, Department of Mechatronics Engineering, SRM University, Kancheepuram
Ravi Teja C, Department of Mechatronics Engineering, SRM University, Kancheepuram

ABSTRACT:

Decision-making is an important phase for manufacturing enterprises competing in the global market. Rapid industrial expansion demands better-quality decisions in the shortest possible time. Non-traditional machining (NTM) processes were developed out of the desire to machine difficult-to-machine materials at a faster rate, at lower cost and with the best possible quality. To meet these requirements, research work is going on in manufacturing industries, particularly the nuclear and aerospace engineering industries. Despite being successful in solving many manufacturing problems, non-traditional machining also poses an important difficulty: the selection of an appropriate process for a particular machining problem. In practice, no single process is capable of satisfying a wide variety of machining problems. This non-versatility of the non-traditional machining processes necessitates an intelligent system in this domain. The selection procedure described in this paper is intended as a general-purpose aid to the designer in making preliminary selections of a non-traditional machining process for a given part. In the proposed procedure, work materials, machined shape, and operational capabilities such as minimum tolerance, minimum surface finish, minimum corner radius, minimum hole diameter, maximum depth-to-diameter ratio and maximum thickness of workpiece are included. Based on the required part characteristics, the proposed neural network generates a list of non-traditional machining processes capable of producing a particular part. This list helps a designer identify possible alternatives early in the design process. The neural network tool Neuralyst has been used to develop the system for non-traditional machining; it uses pattern matching / associative memory. The network was trained and its parameters were optimized for better results.

Keywords: Artificial neural network, Neuralyst, Pattern recognition.

1.0 NONTRADITIONAL MACHINING

Since the 1940s, a revolution in manufacturing has been taking place that once again allows manufacturers to meet the demands imposed by increasingly sophisticated designs and durable, but in many cases nearly unmachinable, materials. This manufacturing revolution is now, as it has been in the past, centred on the use of new tools and new forms of energy. The result has been the introduction of new manufacturing processes known as nontraditional machining (NTM) processes. Conventional manufacturing processes rely on electric motors and hard tools to perform the desired operation. In contrast, nontraditional machining processes can be accomplished with electrochemical reactions, high-temperature plasmas, high-velocity jets of liquids and abrasives, etc. Over 20 different nontraditional processes have been invented and implemented successfully in production. Each process has its own characteristic attributes and limitations; hence no one process is best for all manufacturing situations.
So there is a need for a tool to assist the production/design engineer in selecting an appropriate process for a given situation. In this paper, an attempt is made to use an Artificial Neural Network (ANN) as a tool to perform this task. The parameters of NTM such as minimum tolerance, minimum surface finish, minimum corner radius, minimum hole diameter, minimum overcut and maximum depth-to-diameter ratio are considered as
process capabilities for the process selection. Eleven NTM processes are taken and their corresponding process capabilities are given in Table 1.

Table 1: NTM processes and their corresponding parameters. Columns: minimum tolerance (mm), minimum surface finish (CLA), minimum surface damage (µm), minimum corner radius (mm), minimum taper (mm/mm), minimum hole diameter (mm), minimum width of cut (mm), minimum overcut (mm) and maximum depth-to-diameter ratio. Rows: EDM, ECM, ECG, ECH, AJM, WJM, USM, CHM, LBM, EBM and WEDM.

2.0 ARTIFICIAL NEURAL NETWORKS

Artificial neural networks (ANNs) are modeled on biological neurons and nervous systems. They have the ability to learn, and their processing elements, known as neurons, perform their operations in parallel. ANNs are characterized by their topology, weight vector and activation functions. They have three layers: an input layer, which receives signals from the external world; a hidden layer, which does the processing of the signals; and an output layer, which gives the result back to the external world. Various neural network structures are available. The review of literature reveals that both supervised learning and unsupervised learning have been applied to similar problems.

2.1 MULTI-LAYER PERCEPTRON (MLP)

This is an important class of neural networks, namely the feed-forward networks. Typically, the network consists of a set of input parameters that constitute the input layer, one or more hidden layers of computation nodes and an output layer of computation nodes (Figure 1). The input signal propagates through the network in a forward direction on a layer-by-layer basis.
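As an illustration of the input such a network receives, the nine part-requirement parameters corresponding to Table 1's columns can be collected into a single feature vector. Every numeric value and field name below is a hypothetical placeholder chosen only to show the encoding, not data from the table:

```python
# Order mirrors Table 1's columns; all values are hypothetical placeholders.
requirement = {
    "min_tolerance_mm": 0.05,
    "min_surface_finish_cla": 0.4,
    "min_surface_damage_um": 2.5,
    "min_corner_radius_mm": 0.025,
    "min_taper_mm_per_mm": 0.005,
    "min_hole_dia_mm": 0.5,
    "min_width_of_cut_mm": 0.1,
    "min_overcut_mm": 0.02,
    "max_depth_to_dia_ratio": 10.0,
}

# The nine-element list presented to the input layer (before 0-1 scaling):
input_vector = list(requirement.values())
```

One vector of this form is built per training example, and the matching target vector marks which processes can produce the part.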
Figure 1: Multi-layer network

MLPs have been applied to solve some difficult and diverse problems by training them in a supervised manner with a highly popular algorithm known as the error back-propagation algorithm. Each neuron in the hidden and output layers has an activation function, which is generally a nonlinear function such as the logistic function, given by

f(x) = 1 / (1 + e^(-x)),   (1)

where f(x) is differentiable and

x_j = Σ_{i=1}^{n} W_ij ξ_i + θ_j,   (2)

where W_ij is the weight connecting the ith neuron of the input layer to the jth neuron of the hidden layer, ξ_i is the input vector and θ_j is the threshold of the jth neuron of the hidden layer. Similarly, W_jk is the weight connecting the jth neuron of the hidden layer with the kth neuron of the output layer. Here i indexes the input layer, j the hidden layer and k the output layer. The weights that are important in predicting the process are unknown. The weights of the network to be trained are initialized to small random values; the choice of values obviously affects the rate of convergence. The weights are updated through an iterative learning process known as the error back-propagation (BP) algorithm. Error back-propagation consists of two passes through the different layers of the network: a forward pass, in which an input pattern is presented to the input layer of the network and its effect propagates through the network layer by layer. Finally, a set of outputs is produced as the
actual response of the network. During the forward pass, the synaptic weights of the network are all fixed. The error value is then calculated as the mean square error (MSE), given by

E_tot = (1/N) Σ_{n=1}^{N} E_n,

where

E_n = (1/2) Σ_{k=1}^{m} (ζ_k^n − O_k^n)²,

where m is the number of neurons in the output layer, ζ_k^n is the kth component of the desired (target) output vector and O_k^n is the kth component of the actual output vector. The weights in the links connecting the output and hidden layers, W_jk, are modified as follows:

ΔW_jk = −η (∂E/∂W_jk) = η δ_k y_j,   (3)

where η is the learning rate. Considering the momentum term α,

ΔW_jk^new = η δ_k y_j + α ΔW_jk^old   and   W_jk^new = W_jk^old + ΔW_jk.

Similarly, the weights in the links connecting the hidden and input layers, W_ij, are modified as follows:

ΔW_ij = η δ_j ξ_i,   (4)

W_ij^new = W_ij^old + ΔW_ij,   (5)

δ_k = (ζ_k − O_k) O_k (1 − O_k) for output neurons, and   (6)

δ_j = y_j (1 − y_j) Σ_{k=1}^{m} δ_k W_jk for hidden neurons.   (7)

The training process is carried out until the total error reaches an acceptable level (threshold). If E_tot < E_min, the training process is stopped and the final weights are stored; these are used in the testing phase to determine the performance of the developed network. The training mode adopted was batch mode, where weight updating was performed after the presentation of all training examples that constitute an epoch.
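The forward pass and the update rules above can be sketched as a minimal example. The two-input, one-neuron layer and the values of η and α here are illustrative only, not the configuration used in the paper:

```python
import math

def sigmoid(x):
    # Logistic activation f(x) = 1 / (1 + e^(-x)), as in Eq. (1)
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, weights, thetas):
    # Forward pass for one layer: x_j = sum_i(W_ij * xi_i) + theta_j,
    # followed by the logistic function at each neuron j (Eq. (2)).
    return [sigmoid(sum(row[j] * xi for row, xi in zip(weights, inputs)) + thetas[j])
            for j in range(len(thetas))]

def output_delta(target, output):
    # delta_k = (zeta_k - O_k) * O_k * (1 - O_k)  -- Eq. (6)
    return (target - output) * output * (1.0 - output)

def hidden_delta(y, deltas_out, w_out_row):
    # delta_j = y_j * (1 - y_j) * sum_k(delta_k * W_jk)  -- Eq. (7)
    return y * (1.0 - y) * sum(d * w for d, w in zip(deltas_out, w_out_row))

def weight_step(y, delta, prev_step, eta=0.1, alpha=0.9):
    # Gradient step eta * delta_k * y_j plus momentum alpha * previous step
    return eta * delta * y + alpha * prev_step

# One neuron, one training pattern, target output 1.0:
y = forward([0.2, 0.8], [[0.5], [0.1]], [0.0])[0]
step = weight_step(0.2, output_delta(1.0, y), prev_step=0.0)
```

In batch mode, the steps from all patterns in an epoch would be accumulated before the weights are actually changed.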
2.2 NEURAL NETWORK MODELING

The following is a brief introduction to each step of training and validating a neural network.

1. Determine the structure of the ANN.
2. Divide the known input and output data into two groups: the first to be used to train the network, the second to validate the network in an out-of-sample experiment.
3. Scale all input variables and the desired output variables to the range of 0 to 1.
4. Set initial weights and start a training epoch using the training data set.
5. Input the scaled variables.
6. Distribute the scaled inputs to each hidden node.
7. Weight and sum the inputs to the receiving nodes.
8. Transform hidden-node inputs to outputs.
9. Weight and sum the hidden-node outputs as inputs to the output nodes.
10. Transform the inputs at the output nodes.
11. Calculate the output errors.
12. Back-propagate the errors to adjust the weights.
13. Continue the epoch.
14. Calculate the epoch RMS value of the error.
15. Judge the out-of-sample validity.
16. Use the model for forecasting.

2.3 NEURAL NETWORK ARCHITECTURE

The neural network model definitions and architecture are as follows:

Network type : Feed-forward neural network
No. of nodes in input layer : 9
No. of hidden layers : 1
No. of neurons in hidden layer : 12
No. of neurons in output layer : 11
Transfer function : Sigmoid transfer function in hidden and output layers
Training rule : Back-propagation
Learning rule : Momentum learning method
Momentum learning step size : 0.1
Momentum learning rate : 0.9
No. of epochs : 451
Training termination : Minimum mean square error

3.0 TRAINING AND TESTING OF THE NEURAL NETWORK

The data used for training the network is shown in the table. Eleven parameters of the NTM processes are taken as input to the network. Each output node represents one process; there are 11 output nodes in the output layer. The basic principle behind the neural network is that the input-space variables are mapped to a higher-dimensional feature space where the variables are linearly separable. Hence, the hidden layer should have at least one node more than the number of nodes in the input layer; in this case the hidden layer has 12 nodes. There is no rule of thumb for setting network parameters such as the number of hidden layers, the testing tolerance and the learning rate. So, keeping the other parameters constant, the effects of the testing tolerance and of the number of nodes in the hidden layer were examined over various values, and the results are presented in the form of graphs (shown in Figure 2, Figure 3 and Figure 4). The testing data are given close to a particular process to check the accuracy of the network. The results are shown in Figure 4.

Figure 2: Training tolerance vs. number of epochs
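Step 3 of the modeling procedure in Section 2.2, scaling every variable to the range 0 to 1, can be sketched with a standard min-max transform; the sample values below are illustrative:

```python
def minmax_scale(values):
    # Map each value of a column linearly onto [0, 1] using the column's
    # minimum and maximum, as required before feeding the sigmoid network.
    lo, hi = min(values), max(values)
    if hi == lo:                  # constant column: avoid division by zero
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

scaled = minmax_scale([0.03, 0.13, 0.5, 1.0])
```

The same minimum and maximum recorded on the training set would be reused to scale the testing inputs, so that both sets share one coordinate system.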
Figure 3: Number of nodes in the hidden layer vs. epochs

4.0 ANALYSIS OF RESULTS

As the training tolerance decreases, the number of epochs needed to learn the pattern (input data) increases, because the RMS error allowed for convergence of the network is very small, and to achieve it the network has to redistribute the error back through the back-propagation algorithm. As the training tolerance decreases, the prediction capability of the network increases, but it takes more time for learning. As discussed earlier, the minimum number of nodes in the hidden layer should be 12 in this case. To verify the effect of the number of nodes on the training epochs, the experiment was run with various numbers of nodes and the results are presented in Table 2. The training epochs increase as the number of nodes moves above or below 12. This means that in a 12-dimensional space the input variables are linearly separable. Going beyond 12 nodes is unnecessary, and going below 12 nodes leads to a lower-dimensional space where the input variables are not linearly separable.
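The node-count experiment described above can be sketched as a parameter sweep. `train_network` here is a purely hypothetical stand-in (the actual training was done in Neuralyst); its simulated epoch counts merely mimic the reported trend of a minimum at 12 hidden nodes:

```python
def train_network(hidden_nodes):
    # Hypothetical stand-in for a Neuralyst training run: returns a
    # simulated epoch count that is smallest at 12 hidden nodes,
    # mirroring the trend reported in the analysis above.
    return 451 + 40 * abs(hidden_nodes - 12)

# Sweep the hidden-node count while all other parameters stay fixed.
results = {n: train_network(n) for n in range(9, 17)}
best = min(results, key=results.get)
```

In the real experiment, each sweep point would be a full training run to the chosen tolerance, with the epoch count read from the trainer.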
Figure 4: Network performance (network predicted value vs. expected value)

One should note that the neural network gives results based on the weights. That means that when values near the data used for training are given as input, the network will predict the same value used during training. For example, using the EDM process we can achieve up to 0.03 mm tolerance, and the network was trained using this data. If an input of mm is given as the tolerance needed, then the network will possibly predict EDM as the suitable process, provided all the other parameters are close to the training data. Actually, using EDM we cannot achieve a tolerance of mm. So the network can only be used as an aid for making decisions, and the designer has to check the result for practical application. This issue can be solved by an expert system [6]; but when more than one process satisfies the given specification, the expert system fails to prioritize the processes. The neural network designed here does the additional job of prioritizing as well. From this point of view, the network was found to be better, and the accuracy of the results also matches real-world solutions most of the time.

5.0 CONCLUSION

This investigation highlights the use of a neural network for NTM process selection. The results are very encouraging. Further studies need to be carried out in order to utilize it effectively for the NTM process selection application.
REFERENCES:

[1] Benedict G.F., Nontraditional Manufacturing Processes, Marcel Dekker, Inc., New York.
[2] Can Cogun, "Computer-Aided Preliminary Selection of Nontraditional Machining Processes", Int. J. Mach. Tools Manufact., Vol. 34, No. 3, (1994).
[3] P. Venkateswara Rao, Ch. Nagaraju, Ch. V.V. Rama Rao, "Computer-Aided Selection of Unconventional Machining Processes", 17th AIMTDR, REC, Warangal.
[4] Zurada J.M., Introduction to Artificial Neural Systems, Jaico Publishing House.
[5] Production Technology, HMT.
[6] V. Sugumaran, M.K. Prabakaran, "Expert System for Nontraditional Machining", Proceedings of a National Conference at Annamalai University, (2002).
Visual object classification by sparse convolutional neural networks Alexander Gepperth 1 1- Ruhr-Universität Bochum - Institute for Neural Dynamics Universitätsstraße 150, 44801 Bochum - Germany Abstract.
More informationVolume 1, Issue 3 (2013) ISSN International Journal of Advance Research and Innovation
Application of ANN for Prediction of Surface Roughness in Turning Process: A Review Ranganath M S *, Vipin, R S Mishra Department of Mechanical Engineering, Dehli Technical University, New Delhi, India
More informationSimultaneous Perturbation Stochastic Approximation Algorithm Combined with Neural Network and Fuzzy Simulation
.--- Simultaneous Perturbation Stochastic Approximation Algorithm Combined with Neural Networ and Fuzzy Simulation Abstract - - - - Keywords: Many optimization problems contain fuzzy information. Possibility
More informationCOMP 551 Applied Machine Learning Lecture 14: Neural Networks
COMP 551 Applied Machine Learning Lecture 14: Neural Networks Instructor: (jpineau@cs.mcgill.ca) Class web page: www.cs.mcgill.ca/~jpineau/comp551 Unless otherwise noted, all material posted for this course
More informationMulti-Layered Perceptrons (MLPs)
Multi-Layered Perceptrons (MLPs) The XOR problem is solvable if we add an extra node to a Perceptron A set of weights can be found for the above 5 connections which will enable the XOR of the inputs to
More informationNeural Networks Laboratory EE 329 A
Neural Networks Laboratory EE 329 A Introduction: Artificial Neural Networks (ANN) are widely used to approximate complex systems that are difficult to model using conventional modeling techniques such
More informationPradeep Kumar J, Giriprasad C R
ISSN: 78 7798 Investigation on Application of Fuzzy logic Concept for Evaluation of Electric Discharge Machining Characteristics While Machining Aluminium Silicon Carbide Composite Pradeep Kumar J, Giriprasad
More informationII. ARTIFICIAL NEURAL NETWORK
Applications of Artificial Neural Networks in Power Systems: A Review Harsh Sareen 1, Palak Grover 2 1, 2 HMR Institute of Technology and Management Hamidpur New Delhi, India Abstract: A standout amongst
More informationCursive Handwriting Recognition System Using Feature Extraction and Artificial Neural Network
Cursive Handwriting Recognition System Using Feature Extraction and Artificial Neural Network Utkarsh Dwivedi 1, Pranjal Rajput 2, Manish Kumar Sharma 3 1UG Scholar, Dept. of CSE, GCET, Greater Noida,
More informationANN Based Surface Roughness Prediction In Turning Of AA 6351
ANN Based Surface Roughness Prediction In Turning Of AA 6351 Konani M. Naidu 1, Sadineni Rama Rao 2 1, 2 (Department of Mechanical Engineering, SVCET, RVS Nagar, Chittoor-517127, A.P, India) ABSTRACT Surface
More informationGesture Recognition using Neural Networks
Gesture Recognition using Neural Networks Jeremy Smith Department of Computer Science George Mason University Fairfax, VA Email: jsmitq@masonlive.gmu.edu ABSTRACT A gesture recognition method for body
More informationSimulation of Zhang Suen Algorithm using Feed- Forward Neural Networks
Simulation of Zhang Suen Algorithm using Feed- Forward Neural Networks Ritika Luthra Research Scholar Chandigarh University Gulshan Goyal Associate Professor Chandigarh University ABSTRACT Image Skeletonization
More informationFor Monday. Read chapter 18, sections Homework:
For Monday Read chapter 18, sections 10-12 The material in section 8 and 9 is interesting, but we won t take time to cover it this semester Homework: Chapter 18, exercise 25 a-b Program 4 Model Neuron
More informationHYBRID PARTICLE SWARM OPTIMIZATION MULTI LAYER PERCEPTRON FOR WEB-SERVICES CLASSIFICATION
International Journal on Information Sciences and Computing Vol. 0 No. July 06 HYBRID PARTICLE SWARM OPTIMIZATION MULTI LAYER PERCEPTRON FOR WEB-SERVICES CLASSIFICATION Abstract A.Syed Mustafa Dr. Y.S.
More informationA Class of Instantaneously Trained Neural Networks
A Class of Instantaneously Trained Neural Networks Subhash Kak Department of Electrical & Computer Engineering, Louisiana State University, Baton Rouge, LA 70803-5901 May 7, 2002 Abstract This paper presents
More informationArgha Roy* Dept. of CSE Netaji Subhash Engg. College West Bengal, India.
Volume 3, Issue 3, March 2013 ISSN: 2277 128X International Journal of Advanced Research in Computer Science and Software Engineering Research Paper Available online at: www.ijarcsse.com Training Artificial
More informationNeural Network Neurons
Neural Networks Neural Network Neurons 1 Receives n inputs (plus a bias term) Multiplies each input by its weight Applies activation function to the sum of results Outputs result Activation Functions Given
More informationCP365 Artificial Intelligence
CP365 Artificial Intelligence Tech News! Apple news conference tomorrow? Tech News! Apple news conference tomorrow? Google cancels Project Ara modular phone Weather-Based Stock Market Predictions? Dataset
More information11/14/2010 Intelligent Systems and Soft Computing 1
Lecture 8 Artificial neural networks: Unsupervised learning Introduction Hebbian learning Generalised Hebbian learning algorithm Competitive learning Self-organising computational map: Kohonen network
More informationArtificial Neuron Modelling Based on Wave Shape
Artificial Neuron Modelling Based on Wave Shape Kieran Greer, Distributed Computing Systems, Belfast, UK. http://distributedcomputingsystems.co.uk Version 1.2 Abstract This paper describes a new model
More informationEdge Detection for Dental X-ray Image Segmentation using Neural Network approach
Volume 1, No. 7, September 2012 ISSN 2278-1080 The International Journal of Computer Science & Applications (TIJCSA) RESEARCH PAPER Available Online at http://www.journalofcomputerscience.com/ Edge Detection
More informationESTIMATING THE COST OF ENERGY USAGE IN SPORT CENTRES: A COMPARATIVE MODELLING APPROACH
ESTIMATING THE COST OF ENERGY USAGE IN SPORT CENTRES: A COMPARATIVE MODELLING APPROACH A.H. Boussabaine, R.J. Kirkham and R.G. Grew Construction Cost Engineering Research Group, School of Architecture
More informationCentral Manufacturing Technology Institute, Bangalore , India,
5 th International & 26 th All India Manufacturing Technology, Design and Research Conference (AIMTDR 2014) December 12 th 14 th, 2014, IIT Guwahati, Assam, India Investigation on the influence of cutting
More informationOptimization Methods for Machine Learning (OMML)
Optimization Methods for Machine Learning (OMML) 2nd lecture Prof. L. Palagi References: 1. Bishop Pattern Recognition and Machine Learning, Springer, 2006 (Chap 1) 2. V. Cherlassky, F. Mulier - Learning
More informationNeural Networks (Overview) Prof. Richard Zanibbi
Neural Networks (Overview) Prof. Richard Zanibbi Inspired by Biology Introduction But as used in pattern recognition research, have little relation with real neural systems (studied in neurology and neuroscience)
More informationMachine Learning written examination
Institutionen för informationstenologi Olle Gällmo Universitetsadjunt Adress: Lägerhyddsvägen 2 Box 337 751 05 Uppsala Machine Learning written examination Friday, June 10, 2011 8 00-13 00 Allowed help
More informationUnit V. Neural Fuzzy System
Unit V Neural Fuzzy System 1 Fuzzy Set In the classical set, its characteristic function assigns a value of either 1 or 0 to each individual in the universal set, There by discriminating between members
More informationA Dendrogram. Bioinformatics (Lec 17)
A Dendrogram 3/15/05 1 Hierarchical Clustering [Johnson, SC, 1967] Given n points in R d, compute the distance between every pair of points While (not done) Pick closest pair of points s i and s j and
More informationIMPLEMENTATION OF FPGA-BASED ARTIFICIAL NEURAL NETWORK (ANN) FOR FULL ADDER. Research Scholar, IIT Kharagpur.
Journal of Analysis and Computation (JAC) (An International Peer Reviewed Journal), www.ijaconline.com, ISSN 0973-2861 Volume XI, Issue I, Jan- December 2018 IMPLEMENTATION OF FPGA-BASED ARTIFICIAL NEURAL
More informationResearch on Evaluation Method of Product Style Semantics Based on Neural Network
Research Journal of Applied Sciences, Engineering and Technology 6(23): 4330-4335, 2013 ISSN: 2040-7459; e-issn: 2040-7467 Maxwell Scientific Organization, 2013 Submitted: September 28, 2012 Accepted:
More informationEnsembles of Neural Networks for Forecasting of Time Series of Spacecraft Telemetry
ISSN 1060-992X, Optical Memory and Neural Networks, 2017, Vol. 26, No. 1, pp. 47 54. Allerton Press, Inc., 2017. Ensembles of Neural Networks for Forecasting of Time Series of Spacecraft Telemetry E. E.
More informationWebsite: HOPEFIELD NETWORK. Inderjeet Singh Behl, Ankush Saini, Jaideep Verma. ID-
International Journal Of Scientific Research And Education Volume 1 Issue 7 Pages 154-162 2013 ISSN (e): 2321-7545 Website: http://ijsae.in HOPEFIELD NETWORK Inderjeet Singh Behl, Ankush Saini, Jaideep
More informationNeuralMachine : neural network tool Version 2.0 (October 2004)
NeuralMachine : neural networ tool Version 2.0 (October 2004) NeuralMachine is a general purpose data-driven modelling tool which runs under Windows operating environment. It maes it possible to solve
More informationOpen Access Self-Growing RBF Neural Network Approach for Semantic Image Retrieval
Send Orders for Reprints to reprints@benthamscience.ae The Open Automation and Control Systems Journal, 2014, 6, 1505-1509 1505 Open Access Self-Growing RBF Neural Networ Approach for Semantic Image Retrieval
More information