
The Function Approximation And Application Based On Constructive Feedforward Neural Network

Posted on: 2010-02-11
Degree: Doctor
Type: Dissertation
Country: China
Candidate: M Z Hou
Full Text: PDF
GTID: 1118360305992946
Subject: Computer application technology
Abstract/Summary:
It is well known that artificial neural networks have excellent function approximation ability. However, traditional neural networks based on learning algorithms suffer from many drawbacks: sensitivity to the initial weights, easy convergence to local minima, stagnation in flat regions of the error gradient surface (converging very slowly or not at all), over-fitting and over-training, and uncertainty in choosing the number of hidden neurons. In this dissertation we present several types of constructive single-hidden-layer feedforward neural networks, study their function approximation ability, and apply them to the prediction of EEG signals, the forecasting of environmental data for the Changsha, Zhuzhou and Xiangtan areas, and the data mining of stock price data, comparing the results with those obtained by BP learning networks on the same problems.

Firstly, inspired by the work of B. Llanas, we replace the activation function of the feedforward neural network with a Gaussian function, obtaining a single-hidden-layer feedforward network with Gaussian activation. We prove that such a network with n+1 hidden neurons can interpolate n+1 samples with zero error. The inner and outer weights of these Gaussian networks are then constructed directly from the samples. We prove that the constructed networks can approximately interpolate, with arbitrary precision, any set of distinct data, and we present an upper bound for the error. We further prove that they can uniformly approximate any continuous function with arbitrary precision, again with an explicit error bound. These conclusions are extended to the multidimensional case.
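The zero-error interpolation claim above can be illustrated with a small sketch. This is not the dissertation's exact construction (its inner weights are built by a specific formula, and the width parameter `sigma` here is an illustrative choice): placing one Gaussian hidden neuron at each sample point and solving a linear system for the outer weights yields a network that reproduces every training target exactly, since the Gaussian kernel matrix is nonsingular for distinct centers.

```python
import numpy as np

def build_gaussian_interpolant(xs, ts, sigma=0.5):
    """Single-hidden-layer network with one Gaussian neuron per sample.

    Centers are placed at the sample points; the outer weights c are
    obtained by solving the (n+1)x(n+1) linear system G c = t, so the
    network reproduces every training target exactly (zero-error
    interpolation) as long as the sample points are distinct.
    """
    xs = np.asarray(xs, dtype=float)
    ts = np.asarray(ts, dtype=float)
    # Hidden-layer matrix: G[i, j] = exp(-(x_i - x_j)^2 / (2 sigma^2))
    G = np.exp(-((xs[:, None] - xs[None, :]) ** 2) / (2.0 * sigma ** 2))
    c = np.linalg.solve(G, ts)

    def net(x):
        x = np.asarray(x, dtype=float)
        h = np.exp(-((x[:, None] - xs[None, :]) ** 2) / (2.0 * sigma ** 2))
        return h @ c  # weighted sum of hidden-neuron outputs

    return net

# Interpolate n+1 = 5 samples of sin(x) on [0, pi]
xs = np.linspace(0.0, np.pi, 5)
ts = np.sin(xs)
net = build_gaussian_interpolant(xs, ts)
residual = np.max(np.abs(net(xs) - ts))  # essentially zero at the samples
```

Between the samples the network is only an approximation of sin(x); the theorems summarized above bound that approximation error.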
The correctness and effectiveness of these results are verified through numerical experiments. Because extending the above Gaussian network to the multidimensional case is complex and difficult in practice, we next replace its activation function with a radial basis function (RBF), obtaining RBF single-hidden-layer feedforward neural networks. The inner and outer weights in the multidimensional case are constructed from the samples, and a suitable value of the shape parameter is suggested. We prove that these networks can approximately interpolate, with arbitrary precision, any set of distinct data, and we present an upper bound for the error. Furthermore, we prove that they can uniformly approximate any continuous function with arbitrary precision, again with an explicit error bound. The correctness and effectiveness are verified through numerical experiments.

Next, we replace the activation function with a wavelet function, obtaining feedforward wavelet neural networks. We prove that a wavelet network with n+1 hidden neurons can interpolate n+1 samples with zero error. The inner and outer weights of these wavelet networks are constructed from the samples. We prove that they can approximately interpolate any set of distinct data with arbitrary precision, and that they can uniformly approximate any continuous function with arbitrary precision. These conclusions are extended to the multidimensional case, and their correctness and effectiveness are verified through numerical experiments.

For the multidimensional case, using another, simpler and more effective constructive method, we obtain a second constructive feedforward RBF network, and we prove that a network with n+1 hidden neurons can interpolate n+1 multidimensional samples with zero error.
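The constructive RBF idea — weights read off directly from the samples, with the shape parameter controlling the interpolation error — can be sketched as follows. This is an illustrative scheme under assumed choices (a Gaussian radial basis and the name `epsilon` for the shape parameter), not the dissertation's exact weight formulas: each hidden neuron is centered at one sample and its outer weight is simply that sample's target, so no linear system is solved; as `epsilon` grows, the off-center basis values vanish and the network approximately interpolates the data.

```python
import numpy as np

def rbf_approx_interpolant(X, t, epsilon=20.0):
    """Constructive RBF network: one hidden neuron per sample in R^d.

    Centers sit at the samples and the outer weight of neuron i is the
    target t_i directly -- no training and no linear solve.  At a sample
    x_j the output is t_j plus a cross-term bounded by
    sum_{i != j} |t_i| * exp(-(epsilon * ||x_j - x_i||)^2),
    which shrinks to zero as the shape parameter epsilon increases.
    """
    X = np.atleast_2d(np.asarray(X, dtype=float))
    t = np.asarray(t, dtype=float)

    def net(x):
        x = np.atleast_2d(np.asarray(x, dtype=float))
        # Squared distances from each query point to each center
        d2 = np.sum((x[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        return np.exp(-(epsilon ** 2) * d2) @ t

    return net

# 2-D example: approximately interpolate f(x, y) = x + y on a 4x4 grid
grid = np.array([[i / 3.0, j / 3.0] for i in range(4) for j in range(4)])
targets = grid.sum(axis=1)
net = rbf_approx_interpolant(grid, targets, epsilon=20.0)
err = np.max(np.abs(net(grid) - targets))  # shrinks as epsilon grows
```

The trade-off behind the "suitable shape parameter" suggested in the text is visible here: a very large `epsilon` makes the interpolation error at the samples tiny, but also makes the network nearly zero between samples, so the value must balance interpolation accuracy against smoothness.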
We then prove that these networks can uniformly approximate any continuous multidimensional function with arbitrary precision; their operational simplicity and faster convergence are verified through numerical experiments with Matlab programs. The wavelet neural networks are further extended to L²(R) RBF neural networks, for which the same conclusions hold for multidimensional data and continuous functions. Compared with CRBF, BP, ELM and SVM, the L²(R) RBF networks achieve a faster convergence rate and better generalization performance.

The constructive sigmoid feedforward networks are applied to the prediction of EEG signals and compared with forecasts from BP learning networks; the constructive networks show a clear advantage when the number of samples is small. The BP learning networks and the constructive wavelet networks are applied to forecasting the environmental data of the Changsha, Zhuzhou and Xiangtan areas, which again confirms the practicality, conciseness and stability of the constructive wavelet networks for small-sample prediction. For the data mining of stock price data, one part of the data serves as training samples for several mathematical models and algorithms, including wavelets, BP learning networks and constructive wavelet networks, while the other part serves as verification data to check the models and networks. This once more confirms the practicality, conciseness and stability of the constructive wavelet networks for small sample sizes.
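The prediction applications above (EEG signals, environmental data, stock prices) all reduce a scalar time series to function approximation. A standard way to do this — sketched here as an assumption about the setup, since the abstract does not state the preprocessing used — is a delay embedding: each window of past values becomes a network input and the next value becomes its target, after which any of the constructive networks above can be fitted to the resulting samples.

```python
import numpy as np

def delay_embed(series, window):
    """Turn a scalar series into (input window, next value) sample pairs,
    the usual setup for one-step-ahead prediction with a feedforward
    network: row i of X holds series[i : i+window], and t[i] is the
    value immediately following that window."""
    series = np.asarray(series, dtype=float)
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    t = series[window:]
    return X, t

# Toy series standing in for an EEG or price signal
series = np.sin(np.linspace(0.0, 4.0 * np.pi, 50))
X, t = delay_embed(series, window=4)
# X has one 4-value window per row; t holds the value each window predicts
```

With 50 points and a window of 4 this yields only 46 samples, which is exactly the small-sample regime where the abstract reports the constructive networks outperforming trained BP networks.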
Keywords/Search Tags: constructive feedforward neural network, wavelet neural network, L²(R) RBF neural network, interpolation, approximation