Implementation of Neural Network with a variant of Turing Machine for Traffic Flow Control

Rashmi Sehrawat, Computer Science & Engineering, B.S. Anangpuria Institute of Technology and Management, Faridabad, India. Rashmi.sehrawat04@gmail.com
Honey Malviya, Computer Science & Engineering, B.S. Anangpuria Institute of Technology and Management, Faridabad, India
Vanditaa Kaul, Computer Science & Engineering, B.S. Anangpuria Institute of Technology and Management, Faridabad, India

Abstract - The conventional method of operating a typical traffic light is to distribute the time equally among all directions. This method causes congestion when the throughput of the signal increases and is also ineffective in managing traffic flow. In this paper we propose a new model for managing traffic intelligently. The model is based on a Turing machine with a neural network applied as its control system. The model considers the current traffic status of its own signal along with the status of its adjacent signals to determine the ratio of time slots for each signal, thereby reducing traffic congestion considerably and ensuring a steady flow of traffic over a wide region.

Index Terms - Intelligent Traffic Flow Control, Turing Machine, Neural Network, Multilayer Feed-forward Neural Network, Sigmoid Activation Function.

I. INTRODUCTION

Nowadays traffic congestion is a growing problem in most cities despite the continuing modernization of road infrastructure. The rapid increase in the number of vehicles causes a rapid increase in road traffic intensity. Road throughput and traffic regularity decrease, travel times grow and, what is more, vehicle maintenance costs increase. The proposed model addresses these problems. It is designed to avoid traffic congestion and to minimize the total travel time within the region. It collects information about the incoming traffic from various sources together with the status of its adjacent signals, processes this information and uses it to improve traffic flow. In this model, each traffic signal is a Turing machine, and the control system of this Turing machine is a feed-forward artificial neural network. A Turing machine, similar to a finite automaton but with unlimited and unrestricted memory, is a much more accurate model of a general-purpose computer. Neural network methods belong to the category of non-mathematical models and are considered efficient for the complex non-linear prediction problems that arise in large-scale systems. In this paper a model for traffic flow control is introduced, based on a Turing machine with an application of neural networks. The experimental results show that the introduced model is very effective despite its complexity.

II. TURING MACHINE

In 1936, A. M. Turing proposed the Turing machine as a model of any possible computation. A Turing machine is an automaton whose temporary storage is a tape. The tape is divided into cells, each capable of holding one symbol. A read-write head is associated with the tape; on each move it can read or write a single symbol and move left or right. The machine has a finite control which can be in any one of a finite set of states. A move of the Turing machine is a function of the state of the finite control and the tape symbol being scanned. In one move, the Turing machine may change state, write a tape symbol in the scanned cell, and move the tape head left or right. Figure 1 shows an instance of a Turing machine.
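As a concrete illustration of this kind of transition behaviour, the following minimal sketch simulates one move of a single-tape Turing machine; the states, symbols and transition table in it are hypothetical and are not part of the controller proposed in this paper.

```python
# A minimal single-tape Turing machine step, for illustration only.
# The states, symbols and transition table below are hypothetical.

L, R = -1, +1  # head movements: left and right

# delta: (state, scanned symbol) -> (next state, symbol to write, head move)
delta = {
    ("q0", "A"): ("q1", "A", R),
    ("q0", "B"): ("q1", "B", R),
    ("q1", "B"): ("q0", "X", L),
}

def step(state, tape, head, blank="_"):
    """One move: read the scanned cell, then write, move the head and change state."""
    symbol = tape.get(head, blank)          # the tape is a dict: cell index -> symbol
    if (state, symbol) not in delta:
        return None                         # no move defined: the machine halts
    next_state, write, move = delta[(state, symbol)]
    tape[head] = write
    return next_state, tape, head + move

print(step("q0", {0: "A", 1: "B"}, 0))      # ('q1', {0: 'A', 1: 'B'}, 1)
```

In the model described in the following sections, the tape symbols are not fixed in advance: they are written dynamically from traffic perception, and the transition behaviour is supplied by a neural network rather than a hand-written table.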

Formally, a Turing machine is a 7-tuple

TM = (Q, Σ, Γ, δ, q0, B, F)

where:
Q is a finite set of states.
Σ is a finite set of input symbols.
Γ is the complete set of tape symbols; Σ is always a subset of Γ.
δ: (Q x Γ) → (Q x Γ x {L, R}) is the transition function.
q0 is the start state, a member of Q.
B is the blank symbol; it is in Γ but not in Σ.
F is the set of final (accepting) states, a subset of Q.

Alternative definitions of the Turing machine abound, including versions with multiple tapes or with nondeterminism. These are called variants of the Turing machine model. The original model and its reasonable variants all have the same power: they recognize the same class of languages. The variant used here is the multi-tape Turing machine, which is like an ordinary Turing machine but with several tapes, each having its own head for reading and writing.

Description of the variant of Turing Machine used

In our model every traffic signal is a Turing machine. Assuming that each traffic signal has N adjacent traffic signals in N directions, each Turing machine has 2N + 1 tapes: N tapes for the incoming traffic, another N for the status of the adjacent traffic signals, and one tape for its own status. The machine can be in one of several possible states depending on the throughput of the traffic signal and the magnitude of the remaining traffic. The symbols on the tapes are written dynamically, based on the vehicles perceived in every direction and on the status symbols of the adjacent machines. An input symbol on a tape therefore represents either the magnitude of the incoming traffic in a direction or the status of an adjacent machine. Table 1 shows the symbols used to represent the magnitude of traffic on a road.

Table 1. Symbols representing the magnitude of traffic on a road

Magnitude of Traffic    Symbol
Less than 5%            A
5% - 10%                B
11% - 15%               C
16% - 20%               D
21% - 25%               E
26% - 30%               F
31% - 35%               G
36% - 40%               H
Greater than 40%        X

The heads of the machine are read/write heads, and all Turing machines are connected to their adjacent Turing machines so that they can share their current state.
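To make the tape encoding concrete, the sketch below maps a perceived traffic magnitude to its Table 1 symbol and lays out the 2N + 1 tapes of one machine for N = 4 adjacent signals. The function names and the example figures are illustrative assumptions; only the symbol ranges come from Table 1.

```python
# Illustrative encoding of Table 1 and of the 2N+1 tape layout.
# The function names and the example values are hypothetical; only the
# symbol ranges come from Table 1.

def traffic_symbol(percent: float) -> str:
    """Map a traffic magnitude (percentage) to its Table 1 symbol."""
    if percent < 5:
        return "A"
    bands = [(10, "B"), (15, "C"), (20, "D"), (25, "E"),
             (30, "F"), (35, "G"), (40, "H")]
    for upper, symbol in bands:
        if percent <= upper:
            return symbol
    return "X"                                   # greater than 40%

def make_tapes(incoming_percent, adjacent_status, own_status):
    """Build the 2N+1 tapes of one machine: N incoming-traffic tapes,
    N adjacent-status tapes and one tape holding the machine's own status."""
    assert len(incoming_percent) == len(adjacent_status)
    incoming_tapes = [traffic_symbol(p) for p in incoming_percent]
    return incoming_tapes + list(adjacent_status) + [own_status]

# Example with N = 4 adjacent signals.
print(make_tapes([3.0, 12.5, 27.0, 44.0], ["B", "E", "A", "H"], "C"))
# ['A', 'C', 'F', 'X', 'B', 'E', 'A', 'H', 'C']
```

Keeping perception, neighbour status and own status on separate tapes is what the multi-tape variant provides over the single-tape machine.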

The control system of the machine implements a neural network, which is discussed in the next section. Each machine processes the inputs written dynamically on its tapes to produce the ratio of time slots for each direction, and it shares its current status with the adjacent machines. The transition function is evaluated dynamically from the symbols on the input tapes, and the following operations are triggered (one such cycle is sketched after the list):

1. The neural-network-based control system determines the ratio of time slots for each direction.
2. The next state of the machine is determined from the traffic remaining after the calculated time slot finishes.
3. The current state of the machine is shared with the adjacent machines.
4. The machine then waits for the next perception and starts again.
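The sketch below runs this cycle for a single machine. The Signal class and the toy neural_control() are hypothetical stand-ins for components the paper describes only at a high level; the real control network is the one specified in Section III.

```python
import random

# One machine's control cycle, following steps 1-4 above. The Signal class
# and the toy neural_control() are illustrative stand-ins only.

SYMBOLS = "ABCDEFGHX"

def neural_control(tape_symbols):
    """Toy stand-in for the feed-forward control network of Section III."""
    load = [SYMBOLS.index(s) + 1 for s in tape_symbols[:4]]   # incoming-traffic tapes
    ratio = [v / sum(load) for v in load]                     # share of the cycle per direction
    status = tape_symbols[max(range(4), key=lambda i: load[i])]
    return status, ratio

class Signal:
    def __init__(self, name):
        self.name, self.state, self.neighbour_status = name, "A", {}
    def receive_status(self, sender, status):
        self.neighbour_status[sender] = status

def control_cycle(signal, neighbours, cycles=3):
    for _ in range(cycles):
        # Tapes: four incoming-traffic symbols, four neighbour-status symbols, own status.
        incoming = [random.choice(SYMBOLS) for _ in range(4)]          # simulated perception
        tapes = incoming + [signal.neighbour_status.get(n.name, "A") for n in neighbours] \
                         + [signal.state]
        # 1. The NN-based control system determines the ratio of time slots.
        status, ratio = neural_control(tapes)
        # 2. The next state is set once the time slot finishes (here the busiest
        #    direction's symbol stands in for the remaining traffic).
        signal.state = status
        # 3. The current state is shared with the adjacent machines.
        for n in neighbours:
            n.receive_status(signal.name, signal.state)
        # 4. The machine waits for the next perception and starts again.
        print(signal.name, "state:", signal.state, "slots:", [round(r, 2) for r in ratio])

control_cycle(Signal("S0"), [Signal(f"S{i}") for i in range(1, 5)])
```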

III. NEURAL NETWORK

Multilayer feed-forward neural networks (MFNNs) are acyclic networks that implement input-output mappings. The basic multilayer feed-forward neural network approximates output data by transforming input data, and it consists of an input layer, one or more hidden layers, and an output layer. Each layer contains nodes, or neurons. The interconnections within the network are such that the neurons in layer i are connected to the neurons in layer i + 1; that is, each neuron in layer i is connected to every neuron in the adjacent layer i + 1. Each interconnection has a scalar weight associated with it that is adjusted during the training phase. The number of hidden layers determines the number of layers of weights found in the network. In the most frequently used MFNN model, each non-input neuron calculates its output by applying a sigmoid function (such as the hyperbolic tangent) to its net weighted input. Figure 2 shows the architecture of a two-layer MFNN. The nature of the problem being solved and the complexity of the function being approximated determine the number of input and output neurons as well as the number of hidden layers.

An artificial neural network is typically defined by three types of parameters:

1. The interconnection pattern between the different layers of neurons.
2. The learning process for updating the weights of the interconnections.
3. The activation function that converts a neuron's weighted input into its output activation.

Mathematically, a neuron's network function is defined as a composition of other functions, which can in turn be defined as compositions of further functions. This is conveniently represented as a network structure, with arrows depicting the dependencies between variables, and it can be interpreted in two largely equivalent ways. The first is the function view: the input is a p-dimensional vector that is transformed into a q-dimensional vector, which is then transformed into an r-dimensional vector, and so on, until it is finally transformed into the output. This view is most commonly encountered in the context of optimization. The second is the probabilistic view: the random variable F depends on the random variable G, which depends on the random variable H, and so on down to the input random variable X. This view is most commonly encountered in the context of graphical models.

Neural Network used in the control system

In this model we use a feed-forward neural network with three layers of weights: an input layer, two hidden layers and an output layer. The input layer contains eight nodes, the two hidden layers contain four nodes each, and the output layer contains two nodes. Fig. 3 shows a model of this network.

Fig. 3. Three-layer feed-forward neural network

The eight input nodes of the network correspond to eight tapes of the Turing machine; the symbols read from these tapes represent the magnitude of traffic in the corresponding directions and the status of the adjacent traffic signal in each direction. The output O1 is mapped to the current status of the signal and the output O2 gives the ratio of time slots. A unipolar sigmoid function is used as the activation function. It can be expressed as

f(x) = 1 / (1 + e^(-x))

The sigmoid activation function returns only positive values, in the range [0, 1].

Fig. 4. Sigmoidal activation function
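A minimal sketch of the forward pass through this 8-4-4-2 network follows, assuming the tape symbols are first mapped to numeric values. The symbol-to-number mapping, the random weights and the use of NumPy are illustrative assumptions; the paper fixes its weights by the tying scheme and the simulation of Section IV.

```python
import numpy as np

# Forward pass of the 8-4-4-2 feed-forward network used as the control system.
# The symbol-to-number mapping and the random weights are illustrative only.

def sigmoid(x):
    """Unipolar sigmoid: returns values in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

SYMBOLS = "ABCDEFGHX"
def encode(symbol):
    """Map a Table 1 tape symbol to a number in (0, 1]."""
    return (SYMBOLS.index(symbol) + 1) / len(SYMBOLS)

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 8))   # input layer (8 nodes) -> first hidden layer (4 nodes)
W2 = rng.standard_normal((4, 4))   # first hidden layer -> second hidden layer
W3 = rng.standard_normal((2, 4))   # second hidden layer -> output layer (O1, O2)

def forward(tape_symbols):
    x = np.array([encode(s) for s in tape_symbols])   # eight tape symbols
    h1 = sigmoid(W1 @ x)
    h2 = sigmoid(W2 @ h1)
    o = sigmoid(W3 @ h2)
    return o[0], o[1]   # O1: current status of the signal, O2: ratio of time slot

status, slot_ratio = forward(["A", "C", "F", "X", "B", "E", "A", "H"])
print(round(status, 3), round(slot_ratio, 3))
```

Because the unipolar sigmoid is applied at every non-input neuron, both outputs lie in (0, 1), consistent with Fig. 4.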

IV. PROPOSED WORK

The model is simulated in a virtual traffic environment. The behaviour of the environment is randomized and all possible scenarios are simulated. Overall traffic congestion and average travel time are calculated in order to evaluate the model's parameters, such as its weights. The simulation of the proposed Turing machine with a neural network as its control system is tested against different scenarios, and the model is compared with the conventional method under the same simulated environment conditions as well as with other related neural-network models. Three locations L1, L2 and L3, consisting of five, nine and fifteen signals respectively, are simulated. The topologies used in all the locations are fairly symmetric, which keeps the simulation model simple.

Fig. 5. Virtual traffic environment for location L2

In the figure above, Si represents a traffic signal. Each Si is an instance of the proposed Turing machine model with a neural network as its control system. These Turing machines share their status signal, or more precisely their current state, with their adjacent machines. The weights of the neural networks are varied, and the resulting traffic congestion and average travel time are calculated and used for the evaluation. After a number of iterations, a set of weights is determined that minimizes congestion and the total travel time needed to cross the region. After these experiments the model is tested on locations L1 and L3.

Let the weights between the input layer and the first hidden layer be denoted W1a, W2a, ..., where W1a is the weight between the first input node and the first node of the hidden layer. We then impose

W1a = W3b = W5c = W7d
and
W2a = W4b = W6c = W8d

Let the weights between the two hidden layers be denoted Wap, Waq, .... We then impose

Wap = Wbq = Wcr = Wds
and
Waq = War = Was = Wbp = Wbr = Wbs = Wcp = Wcq = Wcs = Wdp = Wdq = Wdr

The weights between the output layer and the upper hidden layer are equal.
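These constraints leave only a few free parameters per weight layer. The sketch below builds the tied 4 x 8 and 4 x 4 weight matrices from the free values W1a, W2a, Wap and Waq; the assumed input ordering (a traffic/status pair per direction) and the zero entries for the unconstrained input-to-hidden weights are illustrative assumptions, and only the equality constraints come from the text above.

```python
import numpy as np

# Build tied weight matrices from the free parameters of Section IV.
# The assumed input ordering (traffic/status pairs per direction) and the
# zero fill for unconstrained entries are illustrative assumptions.

def tied_weights(w1a, w2a, w_ap, w_aq):
    # Input -> first hidden layer: hidden node 'a' is paired with inputs 1 and 2,
    # 'b' with 3 and 4, 'c' with 5 and 6, 'd' with 7 and 8, so that
    # W1a = W3b = W5c = W7d and W2a = W4b = W6c = W8d. Weights the paper does
    # not constrain explicitly are set to zero here for simplicity.
    W1 = np.zeros((4, 8))
    for h in range(4):                       # hidden nodes a, b, c, d
        W1[h, 2 * h] = w1a
        W1[h, 2 * h + 1] = w2a

    # First hidden -> second hidden layer: Wap = Wbq = Wcr = Wds on the
    # diagonal, and every off-diagonal weight equal to Waq.
    W2 = np.full((4, 4), w_aq)
    np.fill_diagonal(W2, w_ap)
    return W1, W2

W1, W2 = tied_weights(0.7, 0.3, 0.8, -0.1)   # one weight set from Tables 2 and 3
print(W1)
print(W2)
```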

We simulated the model at location L2 with the following sets of weights.

Table 2. Weight sets between the input layer and the first hidden layer

S. No.   W1a   W2a
1        0.5   0.5
2        0.6   0.4
3        0.7   0.3
4        0.8   0.2
5        0.9   0.1

Table 3. Weight sets between the two hidden layers

S. No.   Wap   Waq
1        0.9   -0.1
2        0.8   -0.1
3        0.7   -0.1
4        0.6   -0.1
5        0.9   -0.2
6        0.8   -0.2
7        0.7   -0.2
8        0.9   -0.3

V. RESULTS

The following results were obtained by simulating the model. The table below shows the traffic congestion observed under the conventional model and under the proposed model.

Table 4. Traffic congestion in the conventional and proposed models

Location   Congestion in conventional model   Congestion in proposed model
L1         22.53% to 15.87%                   17.83% to 5.90%
L2         38.27% to 7.98%                    19.37% to 8.90%
L3         44.89% to 24.46%                   22.56% to 7.84%

These results show that traffic congestion, and with it the total travel time needed to cross the region, is reduced considerably. The proposed model performs better than the conventional model and reduces traffic congestion substantially.

VI. CONCLUSION

The proposed model of a Turing machine with a neural network as its control system can be used to solve a variety of problems and provides a powerful system. The capabilities of the Turing machine and of the neural network can be exploited simultaneously. The proposed model is one example of this, and it can be extended and modified to solve other problems.

AUTHORS

First Author - Rashmi Sehrawat, Student (M.Tech), B.S. Anangpuria Institute of Technology & Management, rashmi.sehrawat04@gmail.com
Second Author - Honey Malviya, Student (B.Tech), B.S. Anangpuria Institute of Technology & Management, hny.ken@gmail.com
Third Author - Vanditaa Kaul, Lecturer, B.S. Anangpuria Institute of Technology & Management, vanditaa.kaul@gmail.com