
Research On Transfer Learning For Multi-instance Classification

Posted on: 2018-11-29    Degree: Doctor    Type: Dissertation
Country: China    Candidate: Y H Xu
GTID: 1318330533967122    Subject: Computer Science and Technology
Abstract/Summary:
With the rapid development of computer and information technology, the capacity to store and collect data keeps growing, and the need to analyze and process these data with computers has become increasingly urgent. This makes machine learning ever more important. Traditional machine learning methods rely on the basic assumption that the mechanisms generating the training and test data do not change with the environment. In many real applications, however, this assumption is too rigid and rarely holds. Transfer learning relaxes it and allows the training and test data to follow different distributions. Transfer learning can therefore extract the invariant essential features and structures shared by two different but related domains, so that labeled data and other supervised information can be transferred and reused across domains. Strengthening the study of transfer learning is of great significance for improving the efficiency of machine learning, the performance of existing algorithms, and their practicality. Building on existing transfer learning research, this dissertation studies the transfer learning problem for multi-instance classification and also exploits metric learning to improve algorithm performance. The main work consists of the following four parts:

Firstly, many previous transfer learning studies focus on designing and optimizing objective functions that use the Euclidean distance to measure dissimilarity between instances. In some real-world applications, however, the Euclidean distance may fail to capture the intrinsic similarity or dissimilarity between instances. To address this issue, we propose a metric transfer learning framework (MTLF) that encodes metric learning within transfer learning. In MTLF, instance weights are learned and exploited to bridge the distributions of different domains, while a Mahalanobis distance is learned simultaneously to minimize the intra-class distances and maximize the inter-class distances in the target domain. Unlike previous work, where instance weights and the Mahalanobis distance are trained in a pipelined framework that can propagate errors across components, MTLF learns the instance weights and the Mahalanobis distance in a parallel (joint) framework, making knowledge transfer across domains more effective.

Secondly, for multi-instance classification, this dissertation brings metric learning into the multi-instance setting and proposes a multi-instance metric learning framework. By coupling the distances between bags with the distances between their label vectors, the framework effectively preserves and exploits the intrinsic geometric information of both the feature space and the label space. In this way, the performance of the proposed multi-instance classification algorithm is improved, laying the foundation for multi-instance transfer learning.
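To make the two ingredients behind these first contributions concrete, the following is a minimal, self-contained sketch (not the dissertation's actual MTLF implementation): density-ratio-style instance weights that re-balance the source domain toward the target domain, and a Mahalanobis distance parameterized as M = LᵀL that is updated to shrink intra-class and enlarge inter-class distances. The function names, the KDE-based weight estimate, and the hinge-style update are illustrative assumptions only.

```python
import numpy as np

def estimate_instance_weights(X_src, X_tgt, bandwidth=1.0):
    """Rough density-ratio weights w(x) ~ p_target(x) / p_source(x), estimated
    with Gaussian kernel density estimates; used to re-balance source instances
    toward the target distribution (illustrative, not the paper's estimator)."""
    def kde(Q, X):
        d2 = ((Q[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * bandwidth ** 2)).mean(axis=1) + 1e-12
    w = kde(X_src, X_tgt) / kde(X_src, X_src)
    return w / w.mean()

def metric_step(L, X, y, w, lr=0.01, margin=1.0):
    """One weighted gradient step on the Mahalanobis factor L (M = L^T L):
    same-class pairs are pulled together, different-class pairs closer than
    `margin` are pushed apart; pair weights come from the instance weights."""
    grad = np.zeros_like(L)
    n = len(X)
    for i in range(n):
        for j in range(i + 1, n):
            diff = (X[i] - X[j]).reshape(-1, 1)
            d2 = float(diff.T @ L.T @ L @ diff)
            g = 2.0 * L @ diff @ diff.T          # gradient of d2 w.r.t. L
            if y[i] == y[j]:
                grad += w[i] * w[j] * g          # minimize intra-class distance
            elif d2 < margin:
                grad -= w[i] * w[j] * g          # maximize inter-class distance
    return L - lr * grad / (n * n)

# Toy usage: weight source instances toward the target, then adapt the metric.
rng = np.random.default_rng(0)
X_src = rng.normal(size=(40, 5)); y_src = rng.integers(0, 2, size=40)
X_tgt = rng.normal(loc=0.5, size=(30, 5))
w = estimate_instance_weights(X_src, X_tgt)
L = np.eye(5)
for _ in range(50):
    L = metric_step(L, X_src, y_src, w)
```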
Thirdly, to handle multi-instance classification when the training and test bags are drawn from different distributions, this dissertation analyzes the error of the multi-instance metric learning method under both the traditional learning setting and the transfer learning setting. Based on this analysis, a multi-instance metric transfer learning method (MIMTL) is proposed. MIMTL balances the distributions of the source and target domains by assigning weights to the bags in the source domain, and then uses the re-weighted bag data to construct a multi-instance metric transfer learning model. This solves the multi-instance classification problem when the training and test bag distributions are inconsistent, while using metric learning to improve performance.

Fourthly, since MIMTL involves many tunable parameters and overly restrictive constraints, this dissertation adopts new learning principles: maximizing the discrimination probability between bags of the same class and minimizing the discrimination probability between bags of different classes, in place of the learning principles based on distance constraints. The objective function of the original model is reformulated with these principles, which avoids the excessive tunable parameters and overly tight constraints. On this basis, a multi-instance transfer learning method based on consistent maximum likelihood estimation is proposed, which better solves the multi-instance transfer learning problem.
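As an illustration of this likelihood-style principle, combined with the source-bag weights from the third contribution, below is a small sketch of an NCA-like weighted objective over bags: each bag is mean-pooled, projected by a learned matrix L, and the objective is the weighted log-probability that each bag's softmax neighbourhood mass falls on bags of the same class. The mean-pooling, the function names, and the plain softmax form are illustrative assumptions, not the dissertation's exact consistent maximum likelihood formulation.

```python
import numpy as np

def bag_embedding(bag, L):
    """Mean-pool the instances of a bag and project with the learned matrix L."""
    return bag.mean(axis=0) @ L.T

def weighted_bag_log_likelihood(bags, labels, L, bag_weights):
    """NCA-like objective: for each bag, the softmax (over negative squared
    distances to all other bags) probability mass assigned to same-class bags,
    log-weighted by per-bag importance weights (e.g. source-domain weights)."""
    Z = np.stack([bag_embedding(b, L) for b in bags])
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)                  # a bag never selects itself
    P = np.exp(-d2)
    P /= P.sum(axis=1, keepdims=True)
    same = labels[:, None] == labels[None, :]
    p_same = (P * same).sum(axis=1) + 1e-12
    return float(np.sum(bag_weights * np.log(p_same)))

# Toy usage: evaluate the objective; L can then be optimized with any
# gradient-based or derivative-free routine.
rng = np.random.default_rng(1)
bags = [rng.normal(size=(int(rng.integers(3, 8)), 4)) for _ in range(10)]
labels = np.array([0, 0, 1, 1, 0, 1, 0, 1, 1, 0])
weights = np.ones(len(bags))                      # e.g. MIMTL-style bag weights
print(weighted_bag_log_likelihood(bags, labels, np.eye(4), weights))
```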
Keywords/Search Tags: Transfer Learning, Multi-Instance Classification, Metric Learning, Mahalanobis Distance, Learning Framework