
Aspect-Level Sentiment Analysis Based on Syntactic Information and Interactive Attention

Posted on: 2022-12-03  Degree: Master  Type: Thesis
Country: China  Candidate: W R Wang  Full Text: PDF
GTID: 2505306767498504  Subject: Enterprise Economy
Abstract/Summary:
With the rapid spread of the Internet across all age groups, more and more people take part in social platforms and e-commerce, leaving large numbers of opinions and comments online. Network text that carries a subjective emotional attitude can serve as a basis for consumer decision-making, which has given rise to text sentiment analysis. Aspect-level sentiment analysis takes determining the sentiment polarity toward an evaluated entity as its main task; these polarity judgments reflect satisfaction with a product and provide reference value for other potential customers. The large amount of data on the network has also greatly promoted the application of deep learning and opened up new approaches to sentiment polarity judgment.

Existing approaches apply an attention mechanism in aspect-level sentiment classification, assigning weights to text features to extract semantic-level information while ignoring syntactic information, so the opinions attached to different aspect words cannot be fully exploited for classification. Moreover, previous studies do not model the context and the aspect entities separately; when a text contains multiple aspect entities, each entity cannot be well matched with its corresponding sentiment polarity.

To address these problems, this paper proposes an aspect-level sentiment analysis model based on syntactic information and interactive attention (hereinafter the SICA model). The model consists of two main parts: extraction of contextual text features and extraction of aspect-specific word features. The extracted contextual features and aspect-specific features are concatenated and fed into an interactive attention layer, and the sentiment classification result is obtained from its output.

The contextual feature extraction part mainly includes a word embedding layer, a syntactic information extraction layer, a convolutional layer, and a bidirectional long short-term memory (BiLSTM) layer. The BERT pre-trained language model serves as the word vector model for both the context and the aspect words; the syntactic information layer takes the middle and lower layers of BERT as its main information source and the dependency syntax tree, which encodes dependency relations, as auxiliary information; and the convolutional layer is used to capture global semantic information. In sentiment analysis tasks, especially in the Chinese domain where words often have multiple senses, the BiLSTM encodes rich semantic information and can effectively capture these multiple senses, which a unidirectional LSTM cannot handle. The aspect-specific feature extraction part uses the dependency syntax tree as an aid to extract the aspect words, so that the opinions of different aspect words are not lost, and then extracts their features with a convolutional neural network. The interactive attention module additionally sets a threshold: when an interaction weight is too small, a secondary interaction is performed so that the text information is fully utilized. Finally, the result passes through the sentiment output layer, which judges the sentiment polarity of the text.
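To make the architecture description above concrete, the following is a minimal PyTorch sketch of the two feature-extraction branches and a threshold-gated interactive attention step. It assumes the BERT hidden states for the context and for the aspect span are computed upstream; the layer sizes, the scoring function, and the threshold value are illustrative assumptions rather than the thesis's actual configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SICASketch(nn.Module):
    """Illustrative sketch of the SICA pipeline: a context branch (Conv + BiLSTM),
    an aspect branch (Conv), and interactive attention with a threshold-triggered
    secondary interaction. Sizes and the scoring function are assumptions."""

    def __init__(self, bert_dim=768, hidden=128, num_classes=3, threshold=0.05):
        super().__init__()
        self.threshold = threshold
        # Context branch: convolution for global semantic features, BiLSTM for word senses.
        self.ctx_conv = nn.Conv1d(bert_dim, hidden, kernel_size=3, padding=1)
        self.ctx_lstm = nn.LSTM(hidden, hidden, batch_first=True, bidirectional=True)
        # Aspect branch: convolution over the aspect-word representations.
        self.asp_conv = nn.Conv1d(bert_dim, 2 * hidden, kernel_size=3, padding=1)
        # Scoring function for the interaction (a projected dot product; an assumption).
        self.score_proj = nn.Linear(2 * hidden, 2 * hidden)
        self.out = nn.Linear(4 * hidden, num_classes)

    def forward(self, ctx_hidden, asp_hidden):
        # ctx_hidden: (B, Tc, bert_dim) BERT states, assumed already fused with syntactic info
        # asp_hidden: (B, Ta, bert_dim) BERT states of the aspect-word span
        c = F.relu(self.ctx_conv(ctx_hidden.transpose(1, 2))).transpose(1, 2)
        c, _ = self.ctx_lstm(c)                                   # (B, Tc, 2*hidden)
        a = F.relu(self.asp_conv(asp_hidden.transpose(1, 2))).transpose(1, 2)
        a_pool = a.mean(dim=1)                                    # (B, 2*hidden)

        # Interactive attention: score every context position against the pooled aspect vector.
        scores = torch.bmm(self.score_proj(c), a_pool.unsqueeze(-1)).squeeze(-1)  # (B, Tc)
        weights = torch.softmax(scores, dim=-1)

        # Secondary interaction: if a sample's weights are all below the threshold (too flat),
        # rescore with a sharper distribution so relevant words are not washed out.
        flat = weights.max(dim=-1, keepdim=True).values < self.threshold          # (B, 1)
        weights = torch.where(flat, torch.softmax(scores * 2.0, dim=-1), weights)

        ctx_vec = torch.bmm(weights.unsqueeze(1), c).squeeze(1)   # (B, 2*hidden)
        return self.out(torch.cat([ctx_vec, a_pool], dim=-1))     # sentiment polarity logits
```

Calling `SICASketch()(torch.randn(2, 40, 768), torch.randn(2, 3, 768))` returns a (2, 3) tensor of polarity logits; the fusion of BERT's middle and lower layers with the dependency syntax tree is assumed to happen before `ctx_hidden` is built.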
The main innovations of the SICA model proposed in this paper are as follows. First, a syntactic information layer is built as the extraction layer for all syntactic information, combining the phrase-structure information in the middle and lower layers of the BERT model with the dependency syntax tree that indicates dependency relations. Second, whereas previous studies did not model aspect words and context separately, this paper extracts features based on the specific aspect word sequence, with the dependency syntax tree as auxiliary information. Third, an interaction mechanism is designed: the specific aspect word sequence interacts with the context sequence, a scoring function is defined, the interaction weights are recomputed in a secondary pass, and the relevant word information is comprehensively extracted.

The proposed SICA model was verified experimentally on three public datasets: Restaurant, Laptop, and Twitter. Its effectiveness was demonstrated by comparing the accuracy and F1 scores of different models on these datasets, and the results show that syntactic information contributes significantly to improving model performance.
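For the evaluation mentioned above, accuracy and F1 can be computed as in the brief sketch below; the macro-averaging choice and the variable names are assumptions, since the abstract does not specify them.

```python
from sklearn.metrics import accuracy_score, f1_score


def evaluate(y_true, y_pred):
    """Report the two metrics used to compare models on Restaurant, Laptop and Twitter."""
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "f1": f1_score(y_true, y_pred, average="macro"),  # macro-F1 is an assumption
    }


# Example with 3-class polarity labels (0 = negative, 1 = neutral, 2 = positive)
print(evaluate([2, 0, 1, 2, 1], [2, 0, 2, 2, 1]))
```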
Keywords/Search Tags: BERT pre-trained language model, Syntactic information, Aspect word extraction, Interactive attention