
Deep Transfer Learning For Cross-domain Aspect-based Sentiment Analysis

Posted on: 2021-05-19    Degree: Master    Type: Thesis
Country: China    Candidate: Z X Cao    Full Text: PDF
GTID: 2428330626959679    Subject: Management Science and Engineering
Abstract/Summary:
With the rapid development of the mobile Internet, more and more netizens express their views on social platforms, voice their opinions on news platforms, and comment on products on e-commerce platforms. This produces a large volume of text data carrying sentiment information, which is of great value for sentiment analysis. By granularity, sentiment analysis can be divided into three categories: document-level, sentence-level, and aspect-level. Document-level and sentence-level sentiment analysis classify the sentiment polarity of an entire document or sentence and are therefore coarse-grained, whereas aspect-based sentiment analysis (ABSA) aims to determine the sentiment polarity of a given aspect within a sentence. In practical application scenarios, however, a new domain often has no labeled training data, and manual annotation is time-consuming, so cross-domain sentiment analysis has become a hot research topic in recent years. This thesis addresses fine-grained (aspect-based) sentiment analysis and tackles the shortage of labeled data in the target domain through deep transfer learning. The main research work is as follows:

(1) This thesis first analyzes the shortcomings of existing ABSA methods and then proposes an improved model that combines multiple word representations with multiple attention mechanisms. Existing methods typically rely on static word vectors, which cannot handle polysemy. The proposed model therefore encodes each input word as a combination of three representations: a word-level embedding, a contextual word embedding, and a character-level embedding. To capture the relationship between the aspect term and the review text, the model also applies attention mechanisms (a self-attention mechanism and a co-attention mechanism) to enhance the context representation. Because the aspect usually appears close to the sentiment words that describe it, position information is further integrated at the output layer (a minimal sketch of this architecture is given after the abstract). Experiments show that the proposed method achieves higher accuracy than the baseline methods.

(2) For the case in which the target domain has no labeled data, this thesis proposes a framework that fuses deep transfer learning with semi-supervised learning to solve the cross-domain problem, using labeled data in the source domain to learn knowledge shared between domains. A pre-trained BERT model is used to encode the context, and KL divergence is used as the measure for feature adaptation between domains. Most existing cross-domain methods do not exploit the unlabeled data in the target domain; here, semi-supervised learning uses that unlabeled data to strengthen the model's generalization to unseen data. Specifically, entropy minimization constrains the model to output high-confidence predictions on unlabeled data, while consistency regularization with a back-translation strategy makes the model insensitive to noise (the loss terms are sketched after the abstract). Experiments show that the proposed transfer method is effective for cross-domain problems, and that integrating the semi-supervised components further improves prediction results in the target domain.
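The following is a minimal PyTorch-style sketch of the architecture described in (1). The class name, layer sizes, the use of a precomputed contextual-embedding tensor, and the exact way position weights enter the attention scores are illustrative assumptions, not the thesis's actual implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class AspectSentimentClassifier(nn.Module):
        """Illustrative sketch only: fuses three word representations, applies
        self-attention over the context and co-attention with the aspect term,
        then weights tokens by their distance to the aspect before classifying."""

        def __init__(self, vocab_size=10000, emb_dim=100, ctx_dim=768,
                     char_dim=50, hidden=128, num_classes=3):
            super().__init__()
            self.word_emb = nn.Embedding(vocab_size, emb_dim)              # static word-level embedding
            self.fuse = nn.Linear(emb_dim + ctx_dim + char_dim, hidden)    # combine the three representations
            self.self_attn = nn.MultiheadAttention(hidden, num_heads=4, batch_first=True)
            self.classifier = nn.Linear(hidden, num_classes)

        def forward(self, word_ids, ctx_emb, char_emb, aspect_mask, pos_weight):
            # word_ids: (B, T) token ids; ctx_emb: (B, T, ctx_dim) contextual vectors
            # (e.g. from BERT); char_emb: (B, T, char_dim) from a character encoder;
            # aspect_mask: (B, T) float, 1.0 on aspect tokens; pos_weight: (B, T)
            # weights that decay with distance from the aspect term.
            x = torch.cat([self.word_emb(word_ids), ctx_emb, char_emb], dim=-1)
            h = torch.tanh(self.fuse(x))                                   # (B, T, hidden)
            h, _ = self.self_attn(h, h, h)                                 # self-attention over the context

            # co-attention: the averaged aspect representation attends over all context tokens
            aspect_vec = (h * aspect_mask.unsqueeze(-1)).sum(1) / aspect_mask.sum(1, keepdim=True).clamp(min=1.0)
            scores = torch.bmm(h, aspect_vec.unsqueeze(-1)).squeeze(-1)    # (B, T)
            scores = scores + torch.log(pos_weight.clamp(min=1e-6))        # position prior near the aspect
            attn = F.softmax(scores, dim=-1)

            sent_vec = torch.bmm(attn.unsqueeze(1), h).squeeze(1)          # (B, hidden) weighted context
            return self.classifier(sent_vec)                               # sentiment polarity logits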
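The unsupervised loss terms described in (2) can be sketched as below. The function names, the reduction of domain features to distributions for the KL term, and the combination into a weighted total loss are assumptions made for illustration rather than the thesis's exact formulation.

    import torch
    import torch.nn.functional as F

    def domain_kl_loss(source_feats, target_feats):
        # Align the batch-averaged feature distributions of the two domains with KL
        # divergence; softmax-normalising mean-pooled BERT features is an illustrative
        # simplification of the feature-adaptation measure.
        p = F.softmax(source_feats.mean(dim=0), dim=-1)
        log_q = F.log_softmax(target_feats.mean(dim=0), dim=-1)
        return F.kl_div(log_q, p, reduction="sum")

    def entropy_minimization_loss(logits_unlabeled):
        # Encourage confident (low-entropy) predictions on unlabeled target-domain data.
        probs = F.softmax(logits_unlabeled, dim=-1)
        return -(probs * torch.log(probs.clamp(min=1e-8))).sum(dim=-1).mean()

    def consistency_loss(logits_original, logits_backtranslated):
        # Predictions on a sentence and on its back-translated paraphrase should agree,
        # which makes the model insensitive to this kind of noise.
        p = F.softmax(logits_original.detach(), dim=-1)        # "teacher" view, no gradient
        log_q = F.log_softmax(logits_backtranslated, dim=-1)   # "student" view on the noisy input
        return F.kl_div(log_q, p, reduction="batchmean")

    # Assumed overall objective: supervised cross-entropy on labeled source data plus
    # weighted sums of the three terms above, with the weights as hyper-parameters.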
Keywords: Sentiment analysis, Deep learning, Transfer learning, Semi-supervised learning