
Research On Aspect-Level Sentiment Analysis That Merges Position Information And Attention Mechanism

Posted on: 2022-10-08    Degree: Master    Type: Thesis
Country: China    Candidate: J H Zhang    Full Text: PDF
GTID: 2518306332965369    Subject: Software engineering
Abstract/Summary:
With the rapid development of information technology and the arrival of the 5G era, a large number of users comment on events or products every day on platforms such as Weibo, Twitter, and Taobao. Traditional sentiment analysis methods, however, usually take the whole article, paragraph, or sentence as their object; given today's diversity of information, such methods can no longer meet the practical need to analyze sentiment toward specific targets. Aspect-level sentiment analysis judges the sentiment polarity expressed toward different aspects of a review text and can provide a comprehensive, scientific basis for decisions by governments, enterprises, and consumers, so it has received widespread attention.

Aspect-level sentiment analysis methods are usually based on deep learning models, especially recurrent neural networks, because such models can represent textual information more richly. However, existing methods still have several problems. First, in the context-encoding stage, the positional relation of context words to the aspect term is not fully exploited, which may lead to an insufficient semantic representation of the context. Second, when representing the aspect term, average pooling ignores the internal relations among the words of a multi-word aspect, which may cause semantic loss or semantic errors. Third, the interrelationship between the context and the aspect term is ignored; although some works take this interrelationship into account, they mine it with a holistic encoding method, which causes a certain amount of information loss. Aiming at these three problems, this paper proposes a sentiment analysis model combining position information and attention networks (PAN), in which three modules are designed to address the problems respectively. The main work of this paper includes the following aspects:

1. A position-weight module is designed in the PAN model to enrich the text representation of the context. By applying position weights to the hidden states of the context, words close to the aspect term are assigned larger weights and words far from it are assigned smaller weights.

2. In the PAN model, a self-attention module encodes the aspect term separately so that its representation highlights the central words. When the aspect term contains more than one word, the weight of each word is computed by self-attention to update the aspect representation (a minimal code sketch of these first two modules follows this list).

3. A Bi-Hierarchical Attention Network (BHAN) module is proposed in the PAN model to make full use of the relationship between the aspect term and the context. The context is divided into the parts to the left and right of the aspect term, and attention matrices update the representations of the left context, right context, and aspect term at a finer granularity, providing more useful information for the sentiment classification task.

4. The proposed PAN model is compared with several existing sentiment analysis methods on the public datasets SemEval-2014 Task 4 and ACL14 Twitter. Experimental results show that the PAN model achieves higher accuracy. Ablation experiments further verify the effectiveness of the position-weight module, the self-attention module, and the bi-hierarchical attention network module.
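The following is a minimal PyTorch sketch of the first two modules described above: position weighting of the context hidden states and self-attention pooling of a multi-word aspect term. The linear decay 1 - distance/length, the function and class names, and the single-layer scoring network are illustrative assumptions on my part; the abstract does not give the exact formulas or dimensions used in the thesis, and the BHAN module is not sketched here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def position_weight(hidden, aspect_start, aspect_end):
    """Scale each context hidden state by its closeness to the aspect span.

    hidden: (seq_len, dim) hidden states of the context (e.g. from a BiLSTM).
    aspect_start, aspect_end: token indices of the aspect term (inclusive).
    """
    seq_len, _ = hidden.shape
    positions = torch.arange(seq_len, dtype=torch.float32)
    # Distance of each token to the aspect span (0 for tokens inside the span).
    dist = torch.clamp(aspect_start - positions, min=0) + torch.clamp(positions - aspect_end, min=0)
    weights = 1.0 - dist / seq_len            # assumed linear decay: nearer tokens get larger weights
    return hidden * weights.unsqueeze(-1)     # (seq_len, dim)

class AspectSelfAttention(nn.Module):
    """Pool a multi-word aspect term with learned attention instead of average pooling."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(dim, 1)        # assumed single-layer scoring network

    def forward(self, aspect_hidden):                  # (num_aspect_words, dim)
        alpha = F.softmax(self.score(aspect_hidden), dim=0)
        return (alpha * aspect_hidden).sum(dim=0)      # weighted aspect vector, shape (dim,)
```

As a usage example under these assumptions, in an 8-token sentence whose aspect term spans tokens 3-4, token 0 would receive weight 1 - 3/8 = 0.625 while tokens inside the span receive weight 1, so context words nearer the aspect contribute more to the subsequent attention computation.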
Keywords/Search Tags: Fine-grained sentiment analysis, deep learning, attention mechanisms, recurrent neural networks