, pp.0-5 http://dx.doi.org/10.14257/astl.2015.8.03

The Prediction of Real Estate Price Index Based on an Improved Neural Network Algorithm

Huan Ma, Ming Chen and Jianwei Zhang
Software Engineering College, Zhengzhou University of Light Industry, Zhengzhou 450002, China
songge984046@163.com

Abstract. This paper proposes a new approach, the combination of a hierarchical genetic algorithm and the least squares method, to optimize an RBF neural network for predicting the Real Estate Price Index. It is intended to overcome the flaws of the traditional back-propagation multilayer feed-forward network and the BP neural network: slow convergence, a tendency to fall into local minima, and low forecast accuracy. The method also overcomes the shortcomings of traditional Fourier analysis, offering good localization in both the time domain and the frequency domain.

Keywords: RBF neural network; Real Estate Price Index prediction; hierarchical genetic algorithm; least squares method

1 Introduction

Real estate is a product of the market economy. The real estate price is determined by its value, but it is influenced by many economic, political and social factors. By observing the movement of the Real Estate Price Index, people can grasp the macroeconomic situation of the real estate market and the changes within it; at the micro level, they can also analyze its investment trend. Both of these roles rest on the analysis and forecasting of the Real Estate Price Index. In recent years, as the real estate market has continued to heat up, the risks of the real estate industry have become increasingly prominent. Using scientific methods to reflect the changes in real estate prices, and presenting correct information to guide market participants, has become very urgent.
Therefore, in order to improve the accuracy of prediction, this paper presents a method that combines a hybrid hierarchical genetic algorithm with an RBF neural network, and builds the fused model to predict the closing price of the Real Estate Price Index (SCI).

ISSN: 2287-1233 ASTL
Copyright 2015 SERSC
2 RBF neural network

The Radial Basis Function (RBF) neural network is a feed-forward network with good performance: it can choose an appropriate network topology for different problems, and it offers high approximation precision, small-scale network training, fast learning, and freedom from the local-minima problem. The structure of an RBF neural network consists of three layers: the input layer, the hidden layer and the output layer. The topology is shown in Fig. 1.

[Fig. 1. RBF neural network structure: input layer, hidden layer, output layer]

The nodes of the input layer are only responsible for passing the input signals to the hidden layer, which consists of a group of non-linear radial basis functions. A Gaussian function is generally used as the radial basis function; it applies a nonlinear transformation to the received input signal and then transmits the processed signal to the output layer. The Gaussian function is given by

    \varphi_j(X) = \exp\left( -\frac{\|X - C_j\|^2}{2\sigma_j^2} \right), \quad j = 1, 2, \ldots, L    (1)

where \varphi_j is the output of the j-th hidden node, X \in R^n is the input of the neural network, C_j is the center of the Gaussian function, \sigma_j is the width of the Gaussian function, and L is the number of nodes in the hidden layer. The output layer processes the signals from the hidden layer with a linear weighted combination. The predicted value obtained through the RBF neural network is the one-dimensional output y:

    y = \sum_{j=1}^{L} w_j \varphi_j    (2)

where w_j is the connection weight between the j-th hidden node and the output neuron.
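Assuming a single-output network, equations (1) and (2) can be sketched in a few lines of NumPy (the function and variable names below are illustrative, not from the paper):

```python
import numpy as np

def rbf_forward(x, centers, widths, weights):
    """Forward pass of a single-output RBF network.

    x:       (n,)    input vector X
    centers: (L, n)  Gaussian centers C_j
    widths:  (L,)    Gaussian widths sigma_j
    weights: (L,)    output-layer weights w_j
    """
    # Hidden layer, Eq. (1): phi_j = exp(-||x - C_j||^2 / (2 sigma_j^2))
    dist2 = np.sum((centers - x) ** 2, axis=1)
    phi = np.exp(-dist2 / (2.0 * widths ** 2))
    # Output layer, Eq. (2): linear weighted combination of hidden outputs
    return float(weights @ phi)
```

For example, an input sitting exactly on a center with weight 2.0 (and a zero weight on the other node) yields an output of 2.0, since the Gaussian evaluates to 1 at its center.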
3 The prediction model of the RBF neural network optimized by HHGA

The Hierarchical Genetic Algorithm (HGA) is a novel genetic algorithm introduced in recent years. With the number of hidden nodes fixed, a traditional genetic algorithm optimizes only the centers and widths of the hidden-layer basis functions of the RBF neural network. The HGA is a learning algorithm tailored to the features of the RBF network, in which the network topology, the centers and variances of the hidden-layer basis functions, and the connection weights from the hidden layer to the output layer are optimized simultaneously. In the HGA, each chromosome consists of control genes using binary coding and parameter genes using real coding, following the hierarchical structure of a biological chromosome. A binary control gene of 1 indicates that its corresponding parameter gene is activated, so the corresponding hidden node exists; 0 indicates that its corresponding parameter gene is dormant, so the hidden node does not exist. Each control gene corresponds in sequence to a set of parameter genes. A set of parameter genes contains the center C_i and width \sigma_i of the radial basis function of a hidden node, and the weight w_i of the output layer. The hierarchical chromosome structure of the RBF neural network optimized by HGA is given below.

[Fig. 2. The hierarchical chromosome structure of the RBF neural network optimized by HGA: binary control genes (1/0) in the upper layer; parameter genes (C_1, \sigma_1, w_1), ..., (C_n, \sigma_n, w_n) in the lower layer]

However, in the process of learning, the convergence of this algorithm is quite slow. Since the output layer of the RBF neural network is a linear neuron, once the centers C_i and widths \sigma_i are determined, the weights w_i of the output layer can be calculated by the least squares method, improving the efficiency of network training.
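The least-squares step rests on the observation that, with the centers and widths fixed by the active control genes, the network output is linear in the output weights, so those weights have a closed-form solution. A minimal sketch under that assumption (names and the exact chromosome layout are hypothetical; the paper's encoding details may differ):

```python
import numpy as np

def solve_output_weights(X, y, control, centers, widths):
    """Keep only hidden nodes whose binary control gene is 1, then
    solve the linear output-layer weights by least squares.

    X: (N, n) training inputs, y: (N,) training targets,
    control: (L,) 0/1 control genes,
    centers: (L, n), widths: (L,) parameter genes.
    """
    active = control.astype(bool)           # control genes select hidden nodes
    C, s = centers[active], widths[active]
    # Activation matrix: Phi[k, j] = exp(-||X_k - C_j||^2 / (2 s_j^2))
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
    Phi = np.exp(-d2 / (2.0 * s ** 2))
    # y ~= Phi @ w is linear in w, so w is a least-squares solve
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w
```

In a full HHGA loop, the genetic search would evolve only the control and parameter genes, calling a routine like this inside the fitness evaluation so that the output weights never need to be searched genetically.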
The hierarchical chromosome structure of the RBF neural network optimized by the Hybrid Hierarchical Genetic Algorithm (HHGA), the combination of HGA and the least squares method, is presented below.

[Fig. 3. The hierarchical chromosome structure of the RBF neural network optimized by HHGA: binary control genes in the upper layer; parameter genes (C_1, \sigma_1), ..., (C_n, \sigma_n) in the lower layer]

Chromosome coding. Each chromosome uses a hybrid encoding combining binary coding with real coding. The control gene, using binary coding, is in the upper layer and indicates the
number of hidden-layer nodes. The parameter gene, using real coding, is in the lower layer and contains the parameters of the hidden layer: the center C_i and width \sigma_i of the radial basis function.

4 Experimental process

Step 1: obtain the weight values and ranking of the influence factors. There are many subjective weighting methods, such as the Gulin method, expert investigation, and the analytic hierarchy process (AHP). In this paper, we use the relative weight method. Let r_ij be the score of the i-th factor in the j-th questionnaire; then the original weight of the i-th factor in the j-th investigation is

    w_{ij} = \frac{r_{ij}}{\sum_{i=1}^{n_j} r_{ij}}    (3)

In formula (3), n_j is the number of affected factors in the j-th investigation, so for each investigation \sum_i w_{ij} = 1. The weights, defined by different experts with varying views of each influence factor, may not be the same. In order to eliminate this deviation, we average over all m questionnaires:

    w_i' = \frac{1}{m} \sum_{j=1}^{m} w_{ij}    (4)

We obtain the final weight of each influence factor by the relative weight method, and at the same time the factors can be ranked according to their weights.

Step 2: program with MATLAB, build a BP network, and bring in MIV to get the ranking of influence factors, using engineering data to train the network. When using a BP network, we should first provide a training set, where each sample is determined by an input pattern and the desired output pattern. Suppose the training set has q samples; then a sample can be formalized as

    (X_k, Y_k) = ((x_1, x_2, \ldots, x_n), (y_1, y_2, \ldots, y_n)), \quad k = 1, 2, \ldots, q

where X_k is the k-th input and Y_k is its desired output, corresponding to the n neurons of the input layer and of the output layer, respectively. When the actual output of the network is consistent with the desired output, the learning process is over. Otherwise, the learning algorithm makes the actual output approach the desired output by adjusting the connection weights of the network according to the error between the two outputs.
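Equations (3) and (4) amount to row-normalizing the questionnaire score matrix and then averaging the rows; a small sketch (the score matrix below is made up purely for illustration):

```python
import numpy as np

def relative_weights(scores):
    """Relative weight method of Eqs. (3)-(4).

    scores[j, i] = r_ij, the score of factor i in questionnaire j
    (an m x n matrix for m questionnaires and n factors).
    """
    # Eq. (3): normalize each questionnaire so its weights sum to 1
    w = scores / scores.sum(axis=1, keepdims=True)
    # Eq. (4): average the per-questionnaire weights over all m questionnaires
    return w.mean(axis=0)
```

For two questionnaires scoring two factors as [1, 3] and [2, 2], the per-questionnaire weights are [0.25, 0.75] and [0.5, 0.5], giving final weights [0.375, 0.625]; the final weights still sum to 1, as the text requires.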
Define w_hi as the connection weight from neuron h in the input layer to neuron i in the hidden layer, and define w_ij as the connection weight from
neuron i in the hidden layer to neuron j in the output layer. Suppose that for a sample (X_k, Y_k), the global error function at the output layer is

    E_k = \frac{1}{2} \sum_{j=1}^{m} (\hat{y}_j^k - y_j^k)^2, \quad k = 1, 2, \ldots, q    (5)

where y_j^k is the actual output of output-layer neuron j and \hat{y}_j^k is the desired output. The error over the whole training set is then

    E = \sum_{k=1}^{q} E_k    (6)

For the k-th sample, the weighted input of output-layer neuron j is

    net y_j = \sum_{i=1}^{p} w_{ij} o_i + b_j    (7)

where p is the number of neurons in the hidden layer and o_i is the output of hidden neuron i. The actual output is then

    y_j = f(net y_j)    (8)

In formula (8), the activation function is

    f(u) = \frac{1}{1 + e^{-u}}    (9)

Thus, we can obtain the ranking of the influence factors and their weight values.

5 Conclusion

A series of uncertain factors in the real estate market, such as policy, the economic environment and investor psychology, make real estate prices difficult to predict. Among the many approaches that apply neural networks to real estate price prediction, this paper proposes and implements the combination of HHGA and an RBF neural network to forecast the closing price of the SCI. In addition, compared with the predictions of an RBF neural network and of a BP neural network optimized by GA in the same experiment, the results confirm that the prediction error of the RBF neural network optimized by HHGA is smaller, and its prediction accuracy is superior to the other two methods.

References

1. Hill, T., O'Connor, M., Remus, W.: Neural network models for time series forecasts. Management Science, Vol. 42 (7) (1996), pp. 1082-1092.
2. Guresen, E., Kayakutlu, G., Daim, T.: Using artificial neural network models in real estate market index prediction. Expert Systems with Applications, Vol. 38 (2011), pp. 10389-10397.
3. Armano, G., Marchesi, M., Murru, A.: A hybrid genetic-neural architecture for real estate indexes forecasting. Information Sciences, Vol. 170 (2005), pp. 3-33.
4. Hassan, R., Nath, B., Kirley, M.: A fusion model of HMM, ANN and GA for real estate market forecasting. Expert Systems with Applications, Vol. 33 (2007), pp. 171-180.
5. Lee, M.-C.: Using support vector machine with a hybrid feature selection method to the real estate trend prediction. Expert Systems with Applications, Vol. 36 (8) (2009), pp. 10896-10904.