
Multi-task Learning Based On Stochastic Configuration Radial Basis Network And Its Application

Posted on: 2022-08-04
Degree: Master
Type: Thesis
Country: China
Candidate: X D Kong
Full Text: PDF
GTID: 2518306458497844
Subject: Statistics
Abstract/Summary:
In machine learning we often need to model several related tasks. The traditional approach models each task separately with single-task learning methods; it ignores the correlation among tasks and discards the information shared between the data and the model parameters, so when training data are limited it is difficult to obtain satisfactory performance. Multi-task learning was proposed to overcome these shortcomings: it improves learning performance by exploiting the common information contained in the multiple tasks. How to fully mine that common information and how to build a fast, effective solution model are therefore central problems in multi-task learning. This paper addresses them from three aspects.

(1) A multi-task supervised learning algorithm based on a stochastic configuration radial basis network (MTSL-SCRBN) is proposed. It organically combines the classical parameter-sharing idea of multi-task learning with the constraint-sharing configuration of stochastic configuration networks (SCNs): the network models of the different tasks share the same linear-transformation inner weights and kernel scale parameters, which are randomly assigned under a shared supervisory mechanism (a schematic code sketch of this idea is given after the abstract). By making full use of the data information across tasks, the algorithm also attains a fast solution. Because the tasks are correlated, the random configuration conditions on the model hyperparameters in MTSL-SCRBN are weaker than those in the original SCN algorithm, and the radial basis functions give the final model strong representation ability. Under some reasonable assumptions, this paper proves the convergence of the MTSL-SCRBN algorithm. Experimental results on one simulated data set and three real-life ones verify that MTSL-SCRBN has good learning performance and generalization ability on multi-task problems with insufficient samples.

(2) Multi-task learning methods for high-dimensional data are proposed, combining dimensionality reduction theory with traditional multi-task learning methods. By reducing the dimension of the original data, these methods address the low computational efficiency of traditional multi-task methods on high-dimensional data and thereby widen the application scope of multi-task methods (a second sketch below illustrates one such preprocessing step). Moreover, this paper compares MTSL-SCRBN on high-dimensional data with multi-task learning methods based on deep learning; experimental results on two real-life data sets verify that the algorithm mitigates the low computational efficiency caused by large data dimensions.

(3) Two extensions of multi-task learning are explored. The first concerns the multi-task framework under different activation functions: for the proposed MTSL-SCRBN, several common activation functions are used to form multi-task learning methods based on stochastic configuration networks, verifying the effectiveness of the task-relationship learning method that combines constraint sharing with parameter sharing. The second concerns the impact of data size on multi-task learning, that is, the range of problems for which multi-task methods are appropriate. This paper compares the learning performance and computational efficiency of the multi-task and single-task learning methods on two real-life data sets, and suggests that the applicable range of the multi-task methods discussed here is bounded at roughly the thousand-sample scale.
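To make the shared-configuration idea in (1) concrete, the following is a minimal Python sketch. The function names (rbf_features, fit_mtsl_scrbn), the candidate sampling ranges, and the simple "keep a node only if it lowers the pooled residual" rule are illustrative assumptions standing in for the thesis's actual supervisory inequality, not the authors' implementation.

    import numpy as np

    rng = np.random.default_rng(0)

    def rbf_features(X, centers, scales):
        # Gaussian radial basis features: exp(-||x - c||^2 / s^2).
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / scales[None, :] ** 2)

    def fit_mtsl_scrbn(tasks, max_nodes=50, tol=1e-3):
        # tasks: list of (X, y) pairs that share the same input dimension.
        # Hidden nodes (center + scale) are generated randomly and shared by
        # every task; a candidate node is kept only if it lowers the summed
        # residual over all tasks (an assumed stand-in for the shared
        # supervisory mechanism of SCNs). Output weights stay task-specific
        # and are refit by least squares.
        dim = tasks[0][0].shape[1]
        centers, scales = np.empty((0, dim)), np.empty(0)
        betas = [None] * len(tasks)
        residual = sum(float(y @ y) for _, y in tasks)
        while len(scales) < max_nodes and residual > tol:
            c = rng.uniform(-1.0, 1.0, size=(1, dim))   # candidate center
            s = rng.uniform(0.5, 2.0, size=1)           # candidate scale
            C, S = np.vstack([centers, c]), np.concatenate([scales, s])
            cand_betas, cand_residual = [], 0.0
            for X, y in tasks:
                H = rbf_features(X, C, S)               # shared hidden layer
                beta, *_ = np.linalg.lstsq(H, y, rcond=None)
                cand_betas.append(beta)
                cand_residual += float((y - H @ beta) @ (y - H @ beta))
            if cand_residual < residual:                # shared acceptance test
                centers, scales = C, S
                betas, residual = cand_betas, cand_residual
        return centers, scales, betas

Prediction for task t is then rbf_features(X_new, centers, scales) @ betas[t]: all tasks evaluate the same randomly configured hidden layer, and only the output weights differ, which is the parameter-sharing/constraint-sharing combination the abstract describes.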
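For aspect (2), the abstract does not name the dimensionality reduction method, so the sketch below assumes PCA as one plausible choice: principal axes are fitted on the pooled inputs of all tasks so that every task is projected into the same low-dimensional space before multi-task training. The function name shared_pca is hypothetical.

    import numpy as np

    def shared_pca(task_inputs, n_components):
        # Fit PCA on the pooled inputs of all tasks and project every
        # task's data onto the same shared principal axes.
        X_all = np.vstack(task_inputs)
        mean = X_all.mean(axis=0)
        # Rows of Vt are the principal axes of the centered pooled data.
        _, _, Vt = np.linalg.svd(X_all - mean, full_matrices=False)
        W = Vt[:n_components].T                         # (dim, n_components)
        return [(X - mean) @ W for X in task_inputs]

The projected inputs can then be paired with their targets and passed to fit_mtsl_scrbn above, which is the "reduce first, then learn jointly" pipeline that aspect (2) motivates for high-dimensional data.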
Keywords/Search Tags: multi-task learning, stochastic configuration networks, constraint sharing, parameter sharing, dimensionality reduction method