With the growing popularity of online shopping and online reviews, people increasingly rely on product reviews when making purchase decisions, so sentiment analysis and opinion mining of user reviews have become important tasks. A review often addresses multiple aspects of an entity and may express a different sentiment toward each of them. For example, in "The food is so good and so popular that waiting can really be a nightmare", the sentiment toward "food" is positive, while the sentiment toward "waiting" is negative. It is therefore necessary to analyze sentiment at the aspect level. By comparing recent research, this paper proposes two aspect-level sentiment analysis models that address two problems: interfering words that distort prediction results, and inadequate handling of word-dependency weights.

(1) In review corpora, some interfering words can be mistaken for opinion words, and irrelevant words may lie at a considerable distance from the central (aspect) word. This paper therefore designs an iterative context learning algorithm built around a context attention layer. Within this layer, a context dynamic mask removes words that are far from the central word, and a context dynamic weight reduces the weights of the remaining distant words. The two modules are executed alternately so that distant, irrelevant words have less influence on the sentiment polarity judgment of the central word, and the layer is applied over several iterations to strengthen feature extraction for the aspect words in the local context. The resulting features are then concatenated with the global sentence representation for sentiment classification. Comparative experiments on benchmark datasets show that the proposed method outperforms the compared models.

(2) Context-based models tend to ignore the sentiment dependencies between context words and aspect words. The prevailing approach to modeling word dependencies uses graph convolutional networks, but a graph convolutional network assigns the same weight to every edge between words, whereas a graph attention network can assign different weights to different edges according to word importance. To address these problems, this paper proposes a graph attention fusion network. A syntactic graph attention module enriched with syntactic knowledge establishes word dependencies, and a semantic graph attention module equipped with a self-attention mechanism extracts the semantic associations between words. A transitional graph attention module exchanges syntactic and semantic features, and graph attention fusion then extracts the information shared by sentence syntax and semantics. A convolutional layer is also added to improve the learning of n-gram syntactic features. Finally, the fused graph attention features are merged with the convolutional features for sentiment prediction; parsing both the syntactic and semantic structure of a sentence with graph attention improves prediction accuracy. Experimental results on three benchmark datasets show that the proposed model outperforms the baseline models.
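
The following is a minimal sketch of the context dynamic mask and context dynamic weight ideas described in (1). The distance measure (token offset from the aspect position), the threshold `alpha`, the linear decay, and the alternation scheme are illustrative assumptions rather than the thesis' exact formulation.

```python
import torch

def context_dynamic_mask(hidden, aspect_pos, alpha=3):
    """Zero out hidden states of tokens farther than `alpha` from the aspect word (assumed distance measure)."""
    seq_len, _ = hidden.shape
    dist = torch.abs(torch.arange(seq_len) - aspect_pos)      # token distance to the aspect word
    keep = (dist <= alpha).float().unsqueeze(-1)               # 1 = keep, 0 = mask
    return hidden * keep

def context_dynamic_weight(hidden, aspect_pos, alpha=3):
    """Down-weight tokens in proportion to their distance from the aspect word (assumed linear decay)."""
    seq_len, _ = hidden.shape
    dist = torch.abs(torch.arange(seq_len) - aspect_pos).float()
    weight = torch.clamp(1.0 - (dist - alpha) / seq_len, max=1.0).unsqueeze(-1)
    return hidden * weight

def context_attention_layer(hidden, aspect_pos, iterations=2, alpha=3):
    """Alternate masking and weighting over several iterations, as described in part (1)."""
    out = hidden
    for _ in range(iterations):
        out = context_dynamic_mask(out, aspect_pos, alpha)
        out = context_dynamic_weight(out, aspect_pos, alpha)
    return out
```

In this sketch, `hidden` is a `(sequence_length, hidden_dim)` matrix of token representations and `aspect_pos` is the index of the central aspect word; the output would then be concatenated with the global sentence representation before the classification layer.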
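To illustrate the contrast drawn in (2) between graph convolution and graph attention, the sketch below shows a single-head graph attention layer applied over a dependency-parse adjacency matrix: each edge receives its own learned attention weight instead of the uniform normalized weight a GCN would use. It follows the standard GAT formulation and is not necessarily the exact syntactic graph attention module of the thesis.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SyntacticGraphAttention(nn.Module):
    """Single-head graph attention over a dependency graph (illustrative sketch)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)   # shared node projection
        self.a = nn.Linear(2 * out_dim, 1, bias=False)    # edge scoring function

    def forward(self, h, adj):
        # h: (n_nodes, in_dim) word representations
        # adj: (n_nodes, n_nodes) dependency adjacency matrix; should include self-loops
        z = self.W(h)                                      # (n, out_dim)
        n = z.size(0)
        # Pairwise concatenation [z_i || z_j] for every candidate edge
        zi = z.unsqueeze(1).expand(n, n, -1)
        zj = z.unsqueeze(0).expand(n, n, -1)
        e = F.leaky_relu(self.a(torch.cat([zi, zj], dim=-1)).squeeze(-1))
        # Keep only edges that exist in the dependency parse
        e = e.masked_fill(adj == 0, float('-inf'))
        alpha = torch.softmax(e, dim=-1)                   # per-edge attention weights
        return F.elu(alpha @ z)                            # weighted neighborhood aggregation
```

A GCN layer would instead aggregate neighbors with fixed, degree-normalized weights; the learned `alpha` here is what lets the model emphasize sentiment-bearing dependencies between context words and aspect words.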