
Classifier Design Based On Feature Transfer And Model Transfer

Posted on: 2018-06-04
Degree: Doctor
Type: Dissertation
Country: China
Candidate: S F Zang
Full Text: PDF
GTID: 1318330539475076
Subject: Control theory and control engineering
Abstract/Summary:
In the Internet age, automatic information classification has become an important tool for obtaining valuable information, and it is also a key problem in machine learning, pattern recognition, and data mining. How to use machine learning algorithms to build a good classifier has become a focus of scholars at home and abroad. However, traditional machine learning algorithms usually require the training data to follow the same distribution as the test data, a condition that is often difficult to satisfy in practical applications. Transfer learning overcomes this limitation and has become a new research framework in machine learning: by using information from a source domain that is different from, but related to, the target domain, it helps the target task obtain a good classifier and improves its learning efficiency. Transfer learning enriches machine learning theory, expands its scope of application, and has gradually become a new research hotspot. Taking transfer learning as its research object, this dissertation designs novel classifiers from the perspective of constructing feature transfer and model transfer classifiers, addressing cross-domain distribution discrepancy, semi-supervised transfer, flexible projection constraints, neural network model transfer, and the holistic optimization of knowledge transfer and classifier training. Seven transfer classifiers are proposed in total. The main research contents are as follows:

1. Feature transfer classifiers based on the distribution discrepancy between domains are studied. First, to address the problem that the traditional cross-domain distribution discrepancy used in transfer learning neglects the differing contribution of each individual sample to the global discrepancy, Joint Distribution Adaptation based on Maximal Distribution Weight Mean Discrepancy and Weight Transfer Component Analysis based on Sample Local Discriminant Weight are proposed. A distribution weight is designed for every sample of the source and target domains by using the whitening cosine similarity measure and the Local Discriminant Circle, respectively; this weight is then introduced into the Maximum Mean Discrepancy to reflect the differences between individual samples. On the basis of this discrepancy, knowledge transfer is realized by utilizing Joint Distribution Adaptation (JDA) to minimize the marginal and conditional distribution discrepancies. In addition, Linear Discriminant Analysis is introduced into the objective function of Weight Transfer Component Analysis based on Sample Local Discriminant Weight to realize knowledge transfer while improving class separability. Second, because the traditional cross-domain distribution discrepancy consumes a large amount of memory and neglects the differences between samples, a novel measure, the Cross-domain Mean Approximation Discrepancy (CMAD), is proposed: it measures the discrepancy between the source and target domains by computing the sum of squared distances between the samples of the source domain (and target domain) and the mean of the target domain (and source domain), as sketched below. A feature transfer method, Cross-domain Mean Joint Approximation Embedding (CMJAE), is then presented; its objective function combines the Cross-domain Mean Approximation Discrepancy with subspace learning to minimize the distribution discrepancy between domains. Both the marginal and conditional distribution discrepancies are minimized, and knowledge is transferred between domains during an iterative pseudo-label refinement procedure. Finally, the above feature transfer algorithms are combined with a base classifier in classification experiments to verify their effectiveness and superiority.
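As an illustration of the measure just described, the following is a minimal NumPy sketch of the Cross-domain Mean Approximation Discrepancy. The function name, the optional projection matrix W, and the treatment of normalization are assumptions made for this sketch; the dissertation's exact formulation may differ.

```python
import numpy as np

def cmad(Xs, Xt, W=None):
    """Illustrative Cross-domain Mean Approximation Discrepancy.

    Xs: (n_s, d) source samples; Xt: (n_t, d) target samples;
    W:  optional (d, k) projection into a shared subspace.
    """
    if W is not None:            # evaluate the discrepancy in the learned subspace
        Xs, Xt = Xs @ W, Xt @ W
    mu_s = Xs.mean(axis=0)       # source-domain mean
    mu_t = Xt.mean(axis=0)       # target-domain mean
    # sum of squared distances of source samples to the target mean, and vice versa
    d_s = np.sum((Xs - mu_t) ** 2)
    d_t = np.sum((Xt - mu_s) ** 2)
    # a per-domain normalization (e.g. 1/n_s, 1/n_t) may also be applied
    return float(d_s + d_t)
```

In CMJAE this quantity would be minimized jointly with subspace learning, with the conditional (per-class) terms handled analogously through pseudo-labels.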
2. Feature transfer classifiers based on semi-supervised learning and a flexible projection constraint are presented. First, a feature transfer algorithm, Semi-supervised Transfer Discriminant Analysis based on Cross-domain Mean Constraint (STDA-CMC), is proposed to address the problem that traditional feature transfer learning does not sufficiently exploit the original structure and label information of the samples. This algorithm combines Semi-supervised Discriminant Analysis with Joint Distribution Adaptation and introduces the Cross-domain Mean Constraint mechanism to realize knowledge transfer while making full use of the original structure and label information of the samples, thereby improving classification performance. Second, a feature transfer method, Semi-supervised Flexible Joint Distribution Adaptation (SFJDA), is proposed to solve the problem that traditional feature transfer learning is too rigid in the subspace transformation process and makes inadequate use of the domain structural information and label information of the samples. This method introduces a flexible projection constraint into Joint Distribution Adaptation and improves the quality of the shared feature subspace; combined with Manifold Alignment and Linear Discriminant Analysis, the original structure and label information of the data are exploited during knowledge transfer to improve classification accuracy. Finally, the above feature transfer algorithms are combined with the k-nearest-neighbor classifier in experiments to verify their effectiveness and superiority.

3. Model transfer classifiers based on the Extreme Learning Machine and on Softmax Regression are studied. As a single-hidden-layer feedforward neural network, the Extreme Learning Machine (ELM) has proved to be a very efficient and useful learning mechanism; however, when training samples become scarce, its performance degrades. To improve ELM performance in transfer learning, a classifier with knowledge transfer capability, the Transfer Extreme Learning Machine based on Output Weight Alignment, is designed: the ELM output weight matrices of the source and target domains are aligned to reduce the distribution discrepancy, and an approximation term between the two output weight matrices is introduced into the objective function to realize cross-domain knowledge transfer. The objective function is then transformed into a least squares problem and solved for classification (a sketch of this step follows this item). To realize the unified optimization of the knowledge transfer process and the classifier training process, we further design a classifier with knowledge transfer ability, Transfer Softmax Regression: by introducing the Joint Distribution Adaptation mechanism into the Softmax Regression objective function, a new classifier with knowledge transfer ability is constructed, and the new objective function is solved by gradient descent so that classification and knowledge transfer are optimized jointly. Finally, the validity and superiority of the classification models are verified by classification experiments.
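To make the output-weight-alignment step concrete, below is a minimal NumPy sketch of the kind of least-squares solution described above for the transfer ELM: the target output weights are obtained from a ridge-regularized least-squares objective with an extra term pulling them toward the source output weights. The function names, the regularization weights lam and mu, the sigmoid hidden layer, and the use of a (pseudo-)labelled target set are assumptions made for illustration, not the dissertation's exact formulation.

```python
import numpy as np

def elm_hidden(X, W_in, b):
    """Random-feature hidden layer of an ELM (sigmoid activation)."""
    return 1.0 / (1.0 + np.exp(-(X @ W_in + b)))

def source_output_weights(Hs, Ys, lam=1e-2):
    """Standard regularized ELM solution for the source output weight matrix."""
    L = Hs.shape[1]
    return np.linalg.solve(Hs.T @ Hs + lam * np.eye(L), Hs.T @ Ys)

def target_output_weights(Ht, Yt, beta_s, lam=1e-2, mu=1.0):
    """Least squares on target data plus the alignment term mu*||beta_t - beta_s||^2,
    which pulls the target output weights toward the source output weights."""
    L = Ht.shape[1]
    A = Ht.T @ Ht + (lam + mu) * np.eye(L)
    return np.linalg.solve(A, Ht.T @ Yt + mu * beta_s)
```

Here W_in and b would be drawn once at random and shared across both domains, so that beta_s and beta_t are expressed in the same hidden-feature space; predictions are then taken as the argmax over H @ beta_t.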
Keywords/Search Tags: Feature transfer, Model transfer, Transfer learning, Classifier, Domain, Distribution discrepancy