
Aspect-Level Sentiment Analysis Based On Multiple Neural Networks

Posted on: 2022-08-30
Degree: Master
Type: Thesis
Country: China
Candidate: J Wang
Full Text: PDF
GTID: 2518306560455024
Subject: Software engineering
Abstract/Summary:
With the continuous development of natural language processing technology, people have begun to use neural networks as a bridge between humans and machines in sentiment analysis. Existing methods are no longer sufficient for fine-grained classification, so researchers have turned to aspect-level sentiment analysis, but the accuracy and time cost of current approaches remain unsatisfactory. This paper addresses a series of problems in aspect-level sentiment analysis and the related research; its main contributions are as follows:

(1) This paper presents an aspect-level sentiment analysis model based on a hierarchical neural network. It uses a bidirectional LSTM to encode the text and introduces an attention mechanism to model the relationship between aspect words and the text, obtaining global features of the input that "filter" out the sentiment words before classification. In addition, to address the small amount of available data and the difficulty of acquiring data sets, we introduce transfer learning: a simple pre-trained model first learns semantic knowledge on a large-scale document corpus, and its parameters are then transferred to the formal training model. Experimental results on four public data sets demonstrate the improvement brought by this method.

(2) Existing aspect-level sentiment analysis methods neglect the relationship between the text and the aspect's own words, yet in aspect-level sentiment analysis the connection between contexts is very important, because a sentence contains both the sentiment target and the sentiment polarity. To address this problem, this paper proposes the Multiple Attention Fusion (MAF) model, which introduces a self-attention mechanism and fuses text self-attention, aspect-word self-attention, and aspect-to-text attention in order to obtain both the global and local features of the text. Compared with other similar studies, our model achieves improvements on existing public data sets, and the experiments show that the multi-attention model presented in this paper is effective.
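As a rough illustration of the first contribution, the following PyTorch sketch shows one way a bidirectional LSTM can be combined with an attention mechanism between aspect words and the text to produce polarity logits. This is a minimal sketch under our own assumptions, not the thesis's actual implementation; all class names, layer sizes, and the choice to reuse one LSTM for both text and aspect are hypothetical.

```python
# Hypothetical sketch: BiLSTM context encoder + aspect-conditioned attention
# + softmax classifier over sentiment polarities. Not the paper's code.
import torch
import torch.nn as nn

class BiLSTMAspectAttention(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128, num_classes=3):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # one shared BiLSTM encodes both the sentence and the aspect phrase (assumption)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # scores each context position against the aspect representation
        self.attn = nn.Linear(4 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, text_ids, aspect_ids):
        # text_ids: (batch, seq_len), aspect_ids: (batch, aspect_len)
        ctx, _ = self.lstm(self.embedding(text_ids))      # (batch, seq_len, 2h)
        asp, _ = self.lstm(self.embedding(aspect_ids))    # (batch, asp_len, 2h)
        asp_vec = asp.mean(dim=1, keepdim=True)           # pooled aspect vector
        # concatenate the aspect vector with every context state and score it,
        # so attention "filters" the sentiment-bearing words for this aspect
        scores = self.attn(
            torch.cat([ctx, asp_vec.expand_as(ctx)], dim=-1)).squeeze(-1)
        weights = torch.softmax(scores, dim=-1)           # (batch, seq_len)
        pooled = torch.bmm(weights.unsqueeze(1), ctx).squeeze(1)
        return self.classifier(pooled)                    # polarity logits
```

In the transfer-learning step described above, the embedding and LSTM weights of such a model would first be trained on a large document corpus and then loaded before the formal aspect-level training; the MAF model in contribution (2) would additionally add self-attention over the text and the aspect words, which is omitted here for brevity.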
Keywords/Search Tags: Natural Language Processing, Aspect-level sentiment analysis, Attention, Neural networks, Transfer learning