APPLICATION OF A MULTI-LAYER PERCEPTRON FOR MASS VALUATION OF REAL ESTATES


FIG WORKING WEEK 2008
APPLICATION OF A MULTI-LAYER PERCEPTRON FOR MASS VALUATION OF REAL ESTATES
Tomasz BUDZYŃSKI, PhD

Artificial neural networks: a highly sophisticated modelling technique which makes it possible to model functions of a very high level of complexity.

Architecture of the multi-layer perceptron.

Teaching the multi-layer perceptron: the software routines which simulate the activity of neural networks are not programmed, but rather taught (trained) using various examples, e.g. features of real estates, selling prices and the level of rent.

Teaching the multi-layer perceptron: processing of input data (examples). Successive passes of series of data through the multi-layer perceptron adjust the weights of particular connections and the threshold values of the neurones in such a way that the differences between the output of the network (the real estate value) and the expected result (the real estate price) are minimised.

Design of the multi-layer perceptron: the number of hidden layers and the number of neurones in particular layers.
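The weight-and-threshold adjustment described above can be sketched as a single gradient step for a tiny one-hidden-layer perceptron. The feature vector, network size and learning rate below are purely illustrative assumptions, not the configuration or software used in the study.

```python
import numpy as np

# Minimal sketch (not the author's code): one gradient step for a tiny
# one-hidden-layer perceptron mapping parcel features to a value in zl/m2.
rng = np.random.default_rng(0)
x = rng.random(5)            # hypothetical feature vector (5 inputs)
price = 60.0                 # observed transaction price, zl/m2

W1, b1 = rng.normal(size=(4, 5)), np.zeros(4)   # 4 hidden neurones
W2, b2 = rng.normal(size=4), 0.0

# forward pass: the network's estimate of the real estate value
h = np.tanh(W1 @ x + b1)
value = W2 @ h + b2

# backward pass: adjust weights and thresholds to reduce (value - price)^2
err = value - price
grad_W2, grad_b2 = err * h, err
grad_h = err * W2 * (1.0 - h**2)        # derivative through tanh
grad_W1, grad_b1 = np.outer(grad_h, x), grad_h

lr = 0.01                    # learning rate, an assumed hyper-parameter
W2 -= lr * grad_W2; b2 -= lr * grad_b2
W1 -= lr * grad_W1; b1 -= lr * grad_b1
```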

Training of the multi-layer perceptron: the stage of teaching, the stage of testing and the stage of analysis of results.

Teaching the multi-layer perceptron — teaching algorithms: the back propagation of errors algorithm (BP), the conjugate gradient descent algorithm (CG), the quasi-Newton algorithm (QN) and the Levenberg-Marquardt algorithm (LM).
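Purely as an illustration of how these algorithms differ in mechanics (not the software used in the study), the sketch below fits the same small network with a conjugate-gradient and a quasi-Newton (BFGS) optimiser from SciPy, and with Levenberg-Marquardt via nonlinear least squares; plain back propagation corresponds to the gradient step shown earlier. The data, network size and initialisation are all assumed.

```python
import numpy as np
from scipy.optimize import minimize, least_squares

rng = np.random.default_rng(1)
X = rng.random((71, 5))                # hypothetical teaching subset (71 cases, 5 features)
y = 60 + 30 * rng.standard_normal(71)  # hypothetical prices, zl/m2

n_in, n_hid = X.shape[1], 4

def unpack(p):
    # split the flat parameter vector into weights and thresholds
    i = n_hid * n_in
    W1 = p[:i].reshape(n_hid, n_in)
    b1 = p[i:i + n_hid]
    W2 = p[i + n_hid:i + 2 * n_hid]
    b2 = p[-1]
    return W1, b1, W2, b2

def predict(p, X):
    W1, b1, W2, b2 = unpack(p)
    return np.tanh(X @ W1.T + b1) @ W2 + b2

def residuals(p):
    return predict(p, X) - y

def sse(p):
    return 0.5 * np.sum(residuals(p) ** 2)

p0 = 0.1 * rng.standard_normal(n_hid * n_in + 2 * n_hid + 1)

fit_cg = minimize(sse, p0, method="CG")             # conjugate gradient descent
fit_qn = minimize(sse, p0, method="BFGS")           # quasi-Newton
fit_lm = least_squares(residuals, p0, method="lm")  # Levenberg-Marquardt
```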

Investigations: accuracy of the determination of real estate values using the multi-layer perceptron taught by means of the four teaching algorithms mentioned above.

Investigations: data on 114 transactions of non-built-up land parcels planned for one-family houses, located in Otwock; choice of the features which essentially influence the price level of land on the local real estate market; construction of the ANN models.

Choice of features. Features: location, neighbourhood, technical infrastructure, access to public transport, parcel size, state of development. Methods: the genetic algorithm, the backward step method.

Construction of ANN models. All transactions were divided into: the teaching subset (71 cases), the validation subset (29 cases) and the testing subset (14 cases). Those subsets have similar statistical characteristics: mean values of 60,00 zł/m2, 60,44 zł/m2 and 59,77 zł/m2, and standard deviations of 31,27 zł/m2, 33,31 zł/m2 and 30,35 zł/m2, respectively.
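A minimal sketch of such a split, assuming randomly generated prices and a random 71/29/14 partition (the actual assignment of transactions is not described in the source):

```python
import numpy as np

# Sketch with assumed data: split 114 land transactions into teaching (71),
# validation (29) and testing (14) subsets, then check that the mean and
# standard deviation of the price (zl/m2) stay similar across subsets.
rng = np.random.default_rng(2)
prices = 60 + 31 * rng.standard_normal(114)   # hypothetical prices, zl/m2

idx = rng.permutation(114)
teach, valid, test = np.split(idx, [71, 71 + 29])   # 71 / 29 / 14 cases

for name, part in [("teaching", teach), ("validation", valid), ("testing", test)]:
    p = prices[part]
    print(f"{name:10s} n={p.size:3d} mean={p.mean():6.2f} std={p.std(ddof=1):6.2f}")
```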

Construction of ANN models. Architecture of the multi-layer perceptron: 3, 4 or 5 hidden neurones. Teaching algorithms: back propagation of errors, conjugate gradient descent, quasi-Newton, Levenberg-Marquardt. A total of 1200 multi-layer perceptrons were created.

Results (errors in zł/m2, with the teaching algorithm and the number of teaching epochs):

Hidden neurones | Error, teaching subset | Error, validation subset | Error, testing subset | Algorithm | Epochs
3               | 9,696718               | 10,49435                 | 11,76747              | BP        | 97
4               | 9,643689               | 10,59871                 | 11,31541              | BP        | 99
5               | 9,584256               | 10,36612                 | 11,35798              | BP        | 92
3               | 8,553579               | 9,317702                 | 9,744248              | CG        | 99
4               | 8,896472               | 9,180777                 | 11,35737              | CG        | 96
5               | 8,047045               | 9,097722                 | 10,67246              | CG        | 99
3               | 8,577321               | 9,251484                 | 10,44241              | QN        | 92
4               | 8,996874               | 9,058687                 | 10,02242              | QN        | 86
5               | 8,054216               | 9,437988                 | 10,88689              | QN        | 80
3               | 7,973481               | 8,838693                 | 11,05852              | LM        | 95
4               | 7,460221               | 8,534605                 | 10,97577              | LM        | 94
5               | 7,550017               | 8,587820                 | 9,388863              | LM        | 28

Conclusions. Of the investigated teaching algorithms, the Levenberg-Marquardt algorithm allows the construction of the best multi-layer perceptron, i.e. the perceptron characterised by the smallest error in the validation subset.

Conclusions. The conjugate gradient (CG) and quasi-Newton (QN) algorithms achieve a lower accuracy of determination of real estate values. The back propagation of errors algorithm (BP) turned out to be the least efficient of the investigated algorithms.

Conclusions. These conclusions concern networks with an architecture of low complexity, consisting of several input neurons, several hidden neurons and one output neuron, i.e. the networks constructed for the determination of real estate values.

Conclusions. For a network of low architectural complexity trained on about 100 examples, the differences in the intensity of teaching the multi-layer perceptron with the various teaching algorithms become unimportant.

Conclusions. Increasing the number of hidden neurones in the multi-layer perceptron does not always decrease the error of determination of a real estate value. When teaching the neural network with the Levenberg-Marquardt (LM) and quasi-Newton (QN) algorithms, the error in the validation subset for the network with 5 hidden neurons was greater than the same error for the network with 4 neurons in the hidden layer.

Thank you for your attention