
Research On Multi-Latent Spaces-Based Transfer Learning Algorithms

Posted on: 2017-02-23  Degree: Doctor  Type: Dissertation
Country: China  Candidate: J H Pan  Full Text: PDF
GTID: 1108330488485163  Subject: Computer application technology
Abstract/Summary:
Traditional machine learning methods require that the training data and test data be drawn from the same feature space and follow the same distribution. However, this assumption fails in many real-world applications. Transfer learning has therefore attracted extensive research interest in recent years, as it aims to exploit knowledge from previous tasks to improve learning in new domains. To address the challenge of divergent distributions, many latent space-based transfer learning methods have been proposed for cross-domain tasks. These methods usually construct a single shared latent feature space in which the training and test data follow the same distribution. However, when the distribution divergence between domains is large, such methods cannot obtain and exploit the latent information effectively. It is therefore of great significance to study how to train a strongly adaptive cross-domain model by constructing multiple latent spaces.

In this thesis, we study the strategy of multiple latent spaces in transfer learning and propose several transfer learning algorithms based on it. Our main contributions are as follows.

(1) A novel approach, Quadruple Transfer Learning (QTL), is proposed. It simultaneously learns four kinds of high-level concepts in shared and non-shared latent spaces. Our work is motivated by two observations. First, specific latent information can be used to train the learning model, and most of it is obtained by constructing domain-specific latent spaces; previous methods based on a single shared latent space therefore cannot obtain and exploit the specific latent information effectively. Second, although ambiguous concepts can improve cross-domain learning performance, these methods miss the ambiguous concepts and thus cannot fit different cases.
To solve these problems, our method first introduces the ambiguous concept and formalizes four high-level concepts, including the ambiguous one; it then constructs a shared latent space and a non-shared latent space, and learns the high-level concepts in the corresponding shared and non-shared latent spaces simultaneously. In addition, an iterative QTL algorithm with a convergence guarantee is presented to solve the optimization problem. Finally, extensive experiments demonstrate that QTL is more effective than the baselines and successfully avoids negative transfer.

(2) A novel approach, Multi-Bridge Transfer Learning (MBTL), which learns multiple shared concepts in different shared latent spaces, is proposed. Since the shared latent information in any single latent feature space is only a subset of all the shared latent information, previous methods based on a single shared latent space cannot obtain and exploit the shared latent information effectively. To solve this problem, MBTL constructs multiple shared latent spaces on the co-occurrence raw feature space and then learns the corresponding shared concepts in each shared latent space so as to achieve consistent distributions. Additionally, an iterative MBTL algorithm with a convergence guarantee is proposed to solve the optimization problem. Extensive experiments demonstrate that MBTL significantly outperforms state-of-the-art methods on topic and sentiment classification tasks.

(3) A novel approach, Multi-Layer Transfer Learning (MLTL), is proposed. It simultaneously learns multiple shared and non-shared concepts in different latent spaces.
To address the problem that previous methods based on a single shared latent space cannot effectively obtain and exploit the shared and non-shared latent information, MLTL constructs a shared latent space on the co-occurrence raw feature space and a non-shared latent space in each domain, and combines these latent spaces into one latent feature space layer. The method then stacks multiple such layers and enforces consistent distributions by learning the corresponding distributions on different layers. In addition, an MLTL algorithm with a convergence guarantee is proposed to solve the optimization problem. Finally, extensive experiments demonstrate that MLTL achieves excellent performance on cross-domain text classification tasks.
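Latent spaces in this line of work are commonly built by factorizing the domains' feature matrices with a shared basis. The abstract does not give the thesis's exact objective, so the following is only an illustrative sketch of the general idea, assuming a plain NMF-style factorization X ≈ UVᵀ in which the word-cluster basis V (the shared latent space) is jointly learned from source and target data; the function name and multiplicative update rules are hypothetical simplifications, not the QTL/MBTL/MLTL objectives.

```python
import numpy as np

def shared_latent_factorization(Xs, Xt, k, iters=200, seed=0):
    """Illustrative sketch: factor nonnegative source matrix Xs and
    target matrix Xt (rows = documents, columns = raw features) as
    X ~= U V^T with a single latent basis V shared across domains,
    via standard multiplicative NMF updates."""
    rng = np.random.default_rng(seed)
    ns, d = Xs.shape
    nt, _ = Xt.shape
    Us = rng.random((ns, k)) + 0.1   # source document-concept weights
    Ut = rng.random((nt, k)) + 0.1   # target document-concept weights
    V = rng.random((d, k)) + 0.1     # shared latent feature basis
    eps = 1e-9                       # avoid division by zero
    for _ in range(iters):
        # update each domain's document factors against the shared basis
        Us *= (Xs @ V) / (Us @ (V.T @ V) + eps)
        Ut *= (Xt @ V) / (Ut @ (V.T @ V) + eps)
        # update the shared basis using both domains, so V acts as
        # the "bridge" carrying knowledge between source and target
        V *= (Xs.T @ Us + Xt.T @ Ut) / (V @ (Us.T @ Us + Ut.T @ Ut) + eps)
    return Us, Ut, V
```

In a multi-latent-space method along the thesis's lines, several such bases would be learned (shared and domain-specific ones) rather than a single V, with additional terms coupling the domain distributions.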
Keywords/Search Tags: Transfer Learning, Ambiguous Concept, Shared Latent Space, Specific Latent Space, Multi-Latent Spaces, Cross-Domain Classification