
Neural Network Time Series Forecasting and Its Performance Improvement

Posted on: 2004-07-05
Degree: Master
Type: Thesis
Country: China
Candidate: Q Dai
Full Text: PDF
GTID: 2208360092976025
Subject: Computer software and theory
Abstract/Summary:
Time series prediction is an important aspect of dynamic data analysis and processing. Applications in science, economics, and engineering all demand forecasts of the future based on historical data. For the nonlinear and non-stationary time series that are common in natural and socio-economic phenomena, traditional statistical methods perform poorly. In 1987, Lapedes and Farber first applied neural networks to time series prediction, and this approach has since been widely recognized. Many types of networks have been proposed and applied to forecasting in industry and economics. Research results show that neural networks predict well, providing an effective approach for highly nonlinear and dynamic time series forecasting.

In this thesis, we apply our improved circular back-propagation (ICBP) network to single-step and multi-step time series prediction. ICBP is a generalization of the circular back-propagation (CBP) network. Compared with standard BP, CBP possesses good generalization ability and adaptability, and within its framework vector quantization (VQ) and radial basis function (RBF) networks can be constructed, showing great flexibility. Keeping the original CBP structure unchanged, we add an extra node to the CBP input layer and assign specific values to the weights between that node and the hidden layer, obtaining ICBP, a model more general than CBP. Analysis and simulations both demonstrate the reasonableness and superiority of the proposed model.

During prediction, however, both CBP and ICBP neglect structural changes and correlation within the time series itself: they do not consider how the distance from an observation point to the current prediction point influences performance. The discounted least squares principle describes this influence precisely. In this thesis, the principle is borrowed to construct DLS-ICBP, which achieves substantial improvements in prediction. In addition, to enable comparison with RBF, we construct a DLS-RBF network as a by-product of this work.

On the other hand, when traditional chained neural networks are used to predict p steps ahead, prediction quality declines rapidly as p increases. The reason is that the network inputs do not supply enough information for the future predictions, and this information shortage becomes heavier as the number of steps grows. Duhoux proposed a new kind of neural network chain and showed experimentally that it improves multi-step prediction. In this thesis we adopt this chain to construct a chained DLS-ICBP network and greatly improve the performance of multi-step time series prediction.

Finally, most supervised neural networks are trained by minimizing mean squared error. When models trained in this way are used for forecasting, the presence of outliers causes large imprecision. We adopt a robust method to construct the LOG-ICBP network and study time series prediction with outliers. In LOG-ICBP, the mean squared error (MSE) criterion is replaced by a mean log squared error (MLSE) criterion. Experimental results show that LOG-ICBP predicts better than ICBP when outliers are present.
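The two error criteria discussed above (the discounted least squares weighting and the logarithmic error used in LOG-ICBP) can be sketched with simple formulas. The Python snippet below is an illustration only, not code from the thesis: it assumes a geometric discount factor for the DLS weights and the form log(1 + e^2) for the MLSE, either of which may differ from the exact definitions used in the thesis.

```python
import numpy as np

def dls_weights(n, discount=0.95):
    """Hypothetical discounted least squares weights: the most recent
    observation gets weight 1, older ones are geometrically discounted.
    The thesis's exact weighting scheme may differ."""
    # index 0 = oldest observation, index n-1 = most recent
    return discount ** np.arange(n - 1, -1, -1)

def dls_error(y_true, y_pred, discount=0.95):
    """Discounted least squares error: squared residuals weighted by recency."""
    e = np.asarray(y_true) - np.asarray(y_pred)
    w = dls_weights(len(e), discount)
    return np.sum(w * e ** 2) / np.sum(w)

def mlse_error(y_true, y_pred):
    """Mean log squared error: the logarithm damps the contribution of
    large (outlier) residuals, giving a more robust criterion than MSE.
    The form log(1 + e^2) is an assumption for illustration."""
    e = np.asarray(y_true) - np.asarray(y_pred)
    return np.mean(np.log1p(e ** 2))

if __name__ == "__main__":
    y = np.sin(np.linspace(0, 6, 50))
    y_hat = y + np.random.normal(0, 0.1, size=50)
    y_hat[10] += 5.0  # inject a single outlier
    print("MSE :", np.mean((y - y_hat) ** 2))
    print("DLS :", dls_error(y, y_hat))
    print("MLSE:", mlse_error(y, y_hat))
```

Running the example shows the intended behavior of the two criteria: the single injected outlier inflates the MSE strongly, while the MLSE grows only mildly, and the DLS error weights recent residuals more heavily than old ones.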
Keywords/Search Tags: Neural networks, Time series prediction, Improved circular back-propagation networks, Discounted least squares, Chained neural networks, Multi-step time series prediction, Outliers, Robustness