
Research on Multi-Task Learning

Posted on: 2019-04-19
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Y Li
Full Text: PDF
GTID: 1318330542997982
Subject: Signal and Information Processing
Abstract/Summary:
In machine learning, we often need to model several related tasks, such as face verification, emotion recognition, and age regression. All of these tasks essentially aim to learn machine learning models related to faces. However, traditional machine learning methods model each task separately using single-task learning. They usually ignore the intrinsic relatedness between multiple tasks and cannot learn the true distribution of the training data, especially when the training data are limited. Consequently, valuable information in the training data and model parameters can be missed, which degrades performance. Multi-task learning methods are proposed to handle such problems: they make reasonable assumptions and fully exploit the relatedness between tasks, so that the performance of each related task can be improved with the information provided by the other tasks.

In this dissertation, we focus on multi-task learning and propose several multi-task learning methods for different scenarios, aiming to improve the performance of all related tasks. Existing multi-task learning methods fall into three main categories: feature-based, model-based, and deep-learning-based multi-task learning. Multi-task learning methods have been demonstrated to outperform single-task learning methods on various datasets. However, they have their own drawbacks: (1) they need training data from several tasks, and their effectiveness is constrained by highly complex computation; (2) previous methods consider only one aspect of relatedness, either relatedness between features or relatedness between model parameters, and the relatedness they find is sometimes weak; (3) it is difficult for existing methods to extend linear models to non-linear kernel methods; (4) existing multi-task learning methods consider only the performance of
training tasks, and it is difficult to transfer the learned model to related tasks in the future.

To alleviate these drawbacks, we propose several novel multi-task learning methods, organized into four parts: multi-task proximal support vector machine, joint multi-task model and feature learning, eigenfunction-based multi-task learning, and extensions of multi-task learning. The multi-task learning method based on the proximal support vector machine solves multi-task problems efficiently: it accelerates running time by a factor of ten while achieving comparable performance. Its objective function can be optimized explicitly, and the matrix computations are accelerated to improve the efficiency of the algorithm on large amounts of training data. To address the incomplete use of relatedness information, we propose to jointly learn shared features and shared model parameters. This method measures the relatedness between features and model parameters simultaneously, which improves the utilization of task relatedness; it performs better than methods that share only features or only model parameters. Additionally, we show how to transform the non-convex objective function into an equivalent convex optimization problem that can be optimized easily and efficiently, and we provide a theoretical analysis of the generalization error bound of the proposed method, which demonstrates its merits theoretically. We further propose a multi-task learning method based on eigenfunctions, which easily extends linear models to non-linear kernel models. This novel idea offers a new perspective for research on multi-task learning. We demonstrate the effectiveness of this method through both experiments and theoretical
analysis.

Beyond these standard multi-task learning problems, we discuss multi-task learning in two extended settings. The first considers the classification problem and the feature learning problem jointly, based on deep neural networks, which improves the performance of both classification and feature representation. The second concerns the transferability of the learned model to future related tasks: we propose a multi-task learning method based on domain generalization, in which related tasks learn domain-invariant features so that the trained models have better transferability. We demonstrate the effectiveness of our proposed methods through various experiments, and theoretical analysis is provided for some of the methods to show that our multi-task learning methods outperform single-task learning.

The contributions and novelty can be summarized as follows:
· This dissertation proposes several multi-task learning algorithms that improve the efficiency and effectiveness of multi-task learning, addressing drawbacks of existing methods such as low optimization efficiency, insufficient use of task relatedness information, and the difficulty of extending linear models to non-linear models.
· This dissertation provides both sufficient experimental results and theoretical analysis to demonstrate the effectiveness of the proposed multi-task learning methods.
· The eigenfunction-based multi-task learning method is an entirely new idea in multi-task learning and can inspire research on multi-task learning from a new perspective.
· This dissertation extends multi-task learning by proposing to learn classification and feature representation jointly, and additionally considers the situation in which the learned model must be transferred to future tasks.
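The idea of sharing model parameters across related tasks, which underlies the methods above, can be illustrated with a minimal model-based multi-task ridge regression. This is a generic sketch for illustration only, not the dissertation's actual algorithms: each task t learns weights w0 + v_t, where the shared component w0 captures task relatedness and the task-specific offsets v_t are penalized so that tasks stay close to one another. The function name and hyperparameters are hypothetical.

```python
# Minimal model-based multi-task learning sketch (illustrative, not the
# dissertation's algorithm): task t predicts with weights w0 + v_t.
# Penalizing ||v_t||^2 couples the tasks through the shared vector w0.
import numpy as np

def multi_task_ridge(X_list, y_list, lam_shared=0.1, lam_task=1.0,
                     lr=0.05, epochs=500):
    """Gradient descent on
       sum_t mean((X_t(w0+v_t) - y_t)^2) + lam_shared*||w0||^2
                                        + lam_task*sum_t ||v_t||^2."""
    T = len(X_list)
    d = X_list[0].shape[1]
    w0 = np.zeros(d)          # shared component, learned from all tasks
    V = np.zeros((T, d))      # one task-specific offset per task
    for _ in range(epochs):
        grad_w0 = 2 * lam_shared * w0
        for t in range(T):
            r = X_list[t] @ (w0 + V[t]) - y_list[t]   # residual for task t
            g = 2 * X_list[t].T @ r / len(y_list[t])  # data-fit gradient
            grad_w0 += g                               # every task pulls w0
            V[t] -= lr * (g + 2 * lam_task * V[t])     # task-specific step
        w0 -= lr * grad_w0
    return w0, V
```

With two tasks whose true weights differ only by small offsets, the recovered w0 lands near the common weight vector while each v_t absorbs the task-specific deviation; this is the simplest form of the "shared model parameters" assumption discussed above.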
Keywords/Search Tags:Machine Learning, Multi-task Learning, Deep Learning, Domain Generalization, Support Vector Machine, Kernel Methods