
Instance-based And Feature-based Transfer Learning Approaches With Their Applications

Posted on: 2014-08-01
Degree: Doctor
Type: Dissertation
Country: China
Candidate: S Z Yang
Full Text: PDF
GTID: 1108330479479629
Subject: Applied Mathematics
Abstract/Summary:
With the rapid development of science and technology, the real world has witnessed an exponential increase in data available from multiple sources and modalities. This has spurred extensive research on how to efficiently and effectively process massive amounts of complex data that are high-dimensional, sparse, noisy, and non-i.i.d., a shared focus of researchers in mathematics, computer science, and knowledge engineering. Transfer learning is an effective data-processing tool for handling such complex, massive data. In this thesis, we present a systematic study of transfer learning in terms of theory, methods, and applications. More concretely, the main contributions are as follows:

1. This thesis proposes a novel transfer learning framework that encompasses many methods related to transfer learning, such as multi-task learning, self-taught learning, domain adaptation, sample selection bias, and covariate shift. By working out the relationships among these methods, we can develop new transfer learning methods. We also aim to employ traditional machine learning techniques, such as dimensionality reduction, semi-supervised learning, and active learning, to help develop new methods. In short, this general transfer learning framework is the theoretical foundation for the method and application research that follows.

2. Building on the graph-based semi-supervised learning framework, we propose a graph-based model for transfer learning, which enriches the theory and methods of instance-based transfer learning. We construct a tripartite graph to represent the transfer learning problem and to model the relations between the source-domain and target-domain data more effectively. By learning from the graph spectrum, we obtain a new, informative feature representation. A traditional machine learning model is then trained on this representation to obtain a classifier, which can be used to predict the labels of the target-domain data.
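As a rough illustration of this graph-spectral pipeline, the sketch below embeds pooled source and target instances via the spectrum of a graph Laplacian and then classifies target points in the embedded space. It is a minimal stand-in, not the thesis's method: a plain k-nearest-neighbor instance graph replaces the tripartite graph, and the toy data, parameters, and 1-nearest-neighbor classifier are all illustrative.

```python
import numpy as np

def spectral_transfer_features(X_src, X_tgt, k=5, d=2):
    """Embed source and target instances jointly via graph spectra.

    Builds a k-NN similarity graph over the pooled data, then uses the
    bottom eigenvectors of the normalized Laplacian as a shared feature
    representation, so target points inherit structure from the source.
    """
    X = np.vstack([X_src, X_tgt])
    n = X.shape[0]
    # Pairwise squared distances and a Gaussian similarity
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    sigma2 = np.median(D2) + 1e-12
    S = np.exp(-D2 / sigma2)
    # Keep only the k strongest neighbors per row, then symmetrize
    idx = np.argsort(-S, axis=1)[:, 1:k + 1]
    W = np.zeros_like(S)
    rows = np.repeat(np.arange(n), k)
    W[rows, idx.ravel()] = S[rows, idx.ravel()]
    W = np.maximum(W, W.T)
    # Symmetric normalized Laplacian: L = I - D^{-1/2} W D^{-1/2}
    Dinv = 1.0 / np.sqrt(W.sum(1) + 1e-12)
    L = np.eye(n) - (Dinv[:, None] * W) * Dinv[None, :]
    # Bottom d non-trivial eigenvectors give the new representation
    vals, vecs = np.linalg.eigh(L)
    Z = vecs[:, 1:d + 1]
    return Z[:len(X_src)], Z[len(X_src):]

# Toy use: two blobs; label target points by the nearest embedded source point
rng = np.random.default_rng(0)
X_src = np.vstack([rng.normal(0, .3, (20, 2)), rng.normal(3, .3, (20, 2))])
y_src = np.array([0] * 20 + [1] * 20)
X_tgt = np.vstack([rng.normal(.5, .3, (10, 2)), rng.normal(3.5, .3, (10, 2))])
Z_src, Z_tgt = spectral_transfer_features(X_src, X_tgt)
pred = y_src[np.argmin(((Z_tgt[:, None] - Z_src[None]) ** 2).sum(-1), axis=1)]
```

Because source and target points are embedded jointly, nearness in the spectral space reflects shared graph structure rather than raw distances, which is what lets source labels carry over to the target domain.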
Because the new feature representation of the target-domain data encodes rich information from the instance space, the feature space, and the label space, knowledge from the source-domain data can be transferred to improve learning performance on the target domain. Experiments on several text data sets show the effectiveness of our algorithm.

3. We propose a transfer sparse subspace learning method, which enriches the theory and methods of feature-based transfer learning. First, we present an unsupervised maximum-margin feature selection algorithm based on sparse constraints. The algorithm combines feature selection and K-means clustering in a coherent framework; L2,1-norm regularization is imposed on the transformation matrix to enable feature selection across all data samples, and an iterative algorithm with convergence analysis is provided. We then extend this model to transfer learning problems, proposing a general framework referred to as Transfer Sparse Subspace Learning. The framework accommodates different divergence measures between data distributions, such as MMD, Bregman divergence, and KL divergence. The sparse regularization markedly reduces time and space costs and, more importantly, helps avoid over-fitting. We give solutions under the different distribution-distance criteria, together with convergence analyses. Comprehensive experiments on text and face-image data sets demonstrate that our methods outperform existing transfer learning methods.

4. To handle high-dimensional, sparse, noisy, and non-i.i.d. data simultaneously, we propose a Robust Non-negative Matrix Factorization via joint Sparse and Graph regularization model for transfer learning.
First, we apply a Robust Non-negative Matrix Factorization via Sparse regularization model to the source-domain data and learn a meaningful matrix that captures information common to the source and target domains. Second, we treat this learned matrix as a bridge and transfer it to the target domain, where the target-domain data are reconstructed by a Robust Non-negative Matrix Factorization via joint Sparse and Graph regularization model that accounts for a robust loss function, sparse regularization, and the local structure of the target data. Third, we apply feature selection to the new sparse representation of the target data. Fourth, we provide efficient iterative algorithms, together with rigorous convergence and correctness analyses. Experimental results on both text and image data sets demonstrate that our model outperforms existing state-of-the-art methods.
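The four-step pipeline above is not reproduced here, but its core building block can be sketched: non-negative matrix factorization with a graph regularizer on the codes plus a sparsity penalty, solved by standard multiplicative updates. This is a simplified stand-in for the thesis's robust joint model (it uses the ordinary Frobenius loss rather than a robust loss, constant factors are folded into the penalty weight, and the toy data and parameters are made up).

```python
import numpy as np

def gnmf_sparse(X, A, r=2, lam=0.1, beta=0.01, iters=300, seed=0):
    """Graph-regularized NMF with an L1 sparsity penalty on the codes H.

    Approximately minimizes ||X - W H||_F^2 + lam * tr(H L H^T) + beta * ||H||_1
    via multiplicative updates, where L = D - A is the Laplacian of the
    sample-affinity matrix A (columns of X are samples). The updates keep
    W and H non-negative because every factor in them is non-negative.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r)) + 0.1
    H = rng.random((r, n)) + 0.1
    Dg = np.diag(A.sum(axis=1))    # degree matrix of the affinity graph
    eps = 1e-9
    for _ in range(iters):
        W *= (X @ H.T) / (W @ H @ H.T + eps)
        # H L = H Dg - H A: the -H A part moves to the numerator
        H *= (W.T @ X + lam * H @ A) / (W.T @ W @ H + lam * H @ Dg + beta + eps)
    return W, H

# Toy use: 8 samples built from two latent parts; A links same-group samples
rng = np.random.default_rng(1)
B = np.array([[1., 0.], [1., 0.], [0., 1.], [0., 1.], [.5, .5]])  # 5x2 parts
C = np.zeros((2, 8)); C[0, :4] = 1.0; C[1, 4:] = 1.0              # memberships
X = B @ C + 0.05 * rng.random((5, 8))
A = np.kron(np.eye(2), np.ones((4, 4))) - np.eye(8)
W, H = gnmf_sparse(X, A)
```

The graph term pulls the codes of linked samples together, which is the role played by the local-structure regularizer in the target-domain reconstruction step described above.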
Keywords: Transfer Learning, Dimensionality Reduction, Semi-Supervised Learning, Subspace Learning, Non-Negative Matrix Factorization