
Research on Four-Layer Neural Networks Based on Randomly Mapped Features and Their Multiple Kernel Learning Methods

Posted on: 2022-03-25
Degree: Master
Type: Thesis
Country: China
Candidate: Y Yang
Full Text: PDF
GTID: 2518306527482974
Subject: Software engineering
Abstract/Summary:
In machine learning tasks, feature selection is a key step for sample prediction. When faced with complex, unknown domains, it is difficult to judge the correlation between features and the prediction target, or among the features themselves. Neural network weights are therefore commonly initialized with randomly generated parameters, and a family of algorithms is used to perform feature selection and high-dimensional mapping simultaneously. Effective feature selection not only reduces the feature scale of a model and prevents overfitting, but also improves the model's generalization ability, interpretability, and training speed. Appropriate feature mapping can likewise improve a model's prediction performance. The research work of this thesis is as follows.

To address the many training iterations and long training times of traditional neural networks, this thesis proposes a four-layer neural network based on randomly mapped features (FRMFNN) and a fast incremental learning algorithm for it. FRMFNN is a four-layer network model built on the Random Vector Functional-Link Neural Network (RVFLNN). First, FRMFNN transforms the original input features into randomly mapped features through a specific random mapping algorithm and stores them in the nodes of the first hidden layer. Second, it generates the nodes of the second hidden layer by applying a non-linear activation function to all of the randomly mapped features. Finally, the second hidden layer is connected to the output layer through the output weights. Because the weights of the first and second hidden layers are drawn from a continuous sampling probability distribution and never updated, and because the output weights can be solved quickly by ridge regression, the time-consuming training process of traditional back-propagation neural networks is avoided. When FRMFNN does not reach the prescribed accuracy, its performance can be improved further by the fast incremental algorithm, which avoids retraining the whole network. The thesis gives a detailed description of FRMFNN and its incremental algorithms, together with a proof of the universal approximation property of FRMFNN. Experimental results on several popular classification and regression datasets show the effectiveness of FRMFNN and its fast incremental learning algorithm.
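As a concrete illustration of the training pipeline described above, the following is a minimal NumPy sketch of an FRMFNN-style model: a plain random linear projection stands in for the thesis's random mapping algorithm, tanh is used as the non-linear activation, and the output weights are solved in closed form by ridge regression. The layer widths, the activation, the uniform weight distribution, and the function names (fit_frmfnn, predict_frmfnn) are illustrative assumptions, not the thesis's exact settings.

```python
# Minimal sketch of an FRMFNN-style model: random feature mapping, a second
# hidden layer built by a non-linear activation, and output weights solved
# by ridge regression. Layer widths, tanh, and the uniform sampling of the
# random weights are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def fit_frmfnn(X, Y, n_map=100, n_enh=200, reg=1e-3):
    """X: (n_samples, n_features); Y: (n_samples, n_outputs) targets."""
    # First hidden layer: randomly mapped features (weights fixed, never updated).
    W1 = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_map))
    b1 = rng.uniform(-1.0, 1.0, size=n_map)
    Z = X @ W1 + b1
    # Second hidden layer: non-linear activation applied to all mapped features.
    W2 = rng.uniform(-1.0, 1.0, size=(n_map, n_enh))
    b2 = rng.uniform(-1.0, 1.0, size=n_enh)
    H = np.tanh(Z @ W2 + b2)
    # Output weights in closed form via ridge regression: (H'H + r*I)^-1 H'Y.
    W_out = np.linalg.solve(H.T @ H + reg * np.eye(n_enh), H.T @ Y)
    return W1, b1, W2, b2, W_out

def predict_frmfnn(model, X):
    W1, b1, W2, b2, W_out = model
    H = np.tanh((X @ W1 + b1) @ W2 + b2)
    return H @ W_out
```

Note that if hidden nodes were added, this sketch would simply re-solve the ridge regression from scratch; the fast incremental algorithm proposed in the thesis instead updates the output weights without retraining the whole network.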
To perform high-dimensional mapping of sample features better, this thesis further introduces the kernel method into FRMFNN. Because there is no complete theoretical basis for choosing a kernel function, and because the number of network nodes in FRMFNN is excessively large, a four-layer multiple kernel neural network based on randomly mapped features (MK-FRMFNN) is proposed. First, the original input features are transformed into randomly mapped features by a random mapping algorithm. Then multiple basic kernel matrices are generated through different random kernel mappings. Finally, the combined kernel matrix formed from the basic kernel matrices is connected to the output layer through the output weights. Different random weight matrices are used in the basic kernel mappings of MK-FRMFNN, so the combined kernel matrix not only synthesizes the advantages of various kernel functions but also integrates the characteristics of various random distribution functions, allowing the data to obtain better feature selection and representation in the new feature space. In comparison experiments with the Broad Learning System (BLS) and FRMFNN, the node size of the MK-FRMFNN model is reduced by about two thirds while its classification performance remains stable. Comparisons with mainstream multi-kernel models show that the proposed algorithm can learn large-sample datasets and achieves better classification performance than most of the compared algorithms.
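To make the multiple-kernel step more concrete, here is a minimal sketch, under stated assumptions, of how several basic kernel matrices built from the randomly mapped features might be combined before a ridge-regression readout. The RBF and polynomial kernels, the unweighted average used to combine the basic kernel matrices, and the function names (fit_mk_frmfnn, predict_mk_frmfnn) are illustrative; the thesis's actual random kernel mappings and combination weights are not specified in this abstract.

```python
# Minimal sketch of the multiple-kernel idea: build several basic kernel
# matrices from the randomly mapped features (each kernel applied after its
# own random weight matrix), combine them, and solve the readout by ridge
# regression. Kernel choices and equal combination weights are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(A, B, gamma=0.1):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def poly_kernel(A, B, degree=2):
    return (A @ B.T + 1.0) ** degree

def fit_mk_frmfnn(X, Y, n_map=100, reg=1e-3, kernels=(rbf_kernel, poly_kernel)):
    # Random mapping of the original features, as in FRMFNN.
    W1 = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_map))
    Z = X @ W1
    # One basic kernel matrix per kernel, each with its own random weight matrix.
    Ws = [rng.uniform(-1.0, 1.0, size=(n_map, n_map)) for _ in kernels]
    Ks = [k(Z @ W, Z @ W) for k, W in zip(kernels, Ws)]
    # Combined kernel matrix: an unweighted average, as a simplification.
    K = sum(Ks) / len(Ks)
    # Output weights by ridge regression on the combined kernel matrix.
    alpha = np.linalg.solve(K + reg * np.eye(K.shape[0]), Y)
    return W1, Ws, Z, alpha

def predict_mk_frmfnn(model, X_new, kernels=(rbf_kernel, poly_kernel)):
    W1, Ws, Z_train, alpha = model
    Z = X_new @ W1
    Ks = [k(Z @ W, Z_train @ W) for k, W in zip(kernels, Ws)]
    return (sum(Ks) / len(Ks)) @ alpha
```

Learning non-uniform combination weights for the basic kernel matrices is the usual multiple kernel learning refinement; the unweighted average above is only the simplest stand-in.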
Keywords/Search Tags:Randomly Mapped Feature, Neural Network, Ridge Regression, Incremental Learning, Multi-kernel Learning