
Aspect-Level Sentiment Analysis Based On Graph Attention Network

Posted on: 2024-05-04
Degree: Master
Type: Thesis
Country: China
Candidate: Y Liu
Full Text: PDF
GTID: 2568307064497094
Subject: Engineering
Abstract/Summary:
With the development of 5G and gigabit broadband, people increasingly access information and entertainment through services such as online shopping, live streaming, and social networking. These services generate growing volumes of data rich in sentiment information, which is valuable to merchants and enterprises: understanding consumers' needs and feedback allows them to provide better services and products. However, traditional sentiment analysis methods determine sentiment polarity only by recognizing sentiment-bearing expressions and cannot accurately identify fine-grained emotional expressions. In recent years, the convergence of linguistics and machine learning has provided researchers with more effective and accurate methods: by combining the syntactic dependency relations studied in linguistics with deep learning algorithms, the latent sentiment polarity in text can be analyzed and classified at a fine-grained level. This thesis investigates how to combine linguistic tools, such as dependency parsing, with deep learning methods to analyze and classify aspect-level sentiment information in text. The main work of this thesis is as follows:

(1) Most existing aspect-level sentiment analysis methods model the text as a flat sequence and ignore the syntactic relationships present in the text itself, which leads to slow models and low classification accuracy. To address this, we propose a BERT-based Fusion Dependence Graph Attention model (FDGAT). Built on the pre-trained model BERT, FDGAT reconstructs the sentence dependency tree into an aspect-centric tree, making full use of the dependency relations and lexical information contained in the tree. The text is then embedded with pre-trained BERT to obtain its vector representations, and the downstream classification task provides supervision during fine-tuning. Finally, the dependency features, lexical features, and textual features are fused through an attention mechanism, so that the model focuses on information related to the aspect words and performs sentiment classification.

(2) Existing graph attention networks that use static attention cannot effectively focus on the important parts of a sentence and do not fully exploit the information in the dependency tree. To address this, this thesis proposes a RoBERTa-based Fusion Dependence Dynamic Graph Attention network model (FDDGAT), which replaces the static attention of the standard GAT with dynamic attention, allowing the model to focus on the important parts of the text in order, better understand the relationships and features within sentences, and improve its ability to process important text. We also explore the path relationships between nodes in the tree and embed them as features in the model. Experimental results on four public datasets show that the proposed models achieve higher classification accuracy and better overall performance than the baseline models, which objectively validates their effectiveness.
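As a rough illustration of the attention-based feature fusion described for FDGAT, the sketch below combines dependency, lexical (part-of-speech), and textual features using an aspect-conditioned attention weighting before classification. It is a minimal sketch in PyTorch: the module names, tensor shapes, and scoring function are illustrative assumptions, not the thesis implementation.

```python
# Hypothetical sketch of aspect-conditioned fusion of dependency, lexical and
# textual features, loosely following the FDGAT description in the abstract.
import torch
import torch.nn as nn

class AspectFeatureFusion(nn.Module):
    def __init__(self, hidden: int, n_classes: int = 3):
        super().__init__()
        self.query = nn.Linear(hidden, hidden)      # query built from the aspect representation
        self.key = nn.Linear(hidden, hidden)        # keys built from each feature view
        self.classifier = nn.Linear(hidden, n_classes)

    def forward(self, dep_feat, pos_feat, text_feat, aspect_repr):
        # dep_feat, pos_feat, text_feat: (batch, hidden) pooled feature views
        # aspect_repr: (batch, hidden) pooled BERT representation of the aspect term
        feats = torch.stack([dep_feat, pos_feat, text_feat], dim=1)        # (b, 3, h)
        q = self.query(aspect_repr).unsqueeze(1)                           # (b, 1, h)
        k = self.key(feats)                                                # (b, 3, h)
        # Scaled dot-product attention over the three feature views.
        attn = torch.softmax((q * k).sum(-1) / k.size(-1) ** 0.5, dim=-1)  # (b, 3)
        fused = (attn.unsqueeze(-1) * feats).sum(dim=1)                    # (b, h)
        return self.classifier(fused)                                      # sentiment logits
```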
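The contrast between static and dynamic graph attention mentioned for FDDGAT can be sketched as follows: in standard GAT the nonlinearity is applied after the attention vector, so the ranking of neighbours is effectively the same for every query node, whereas a GATv2-style dynamic layer applies the nonlinearity to the combined pair representation first. The code below is a simplified sketch under that assumption (PyTorch, shared projection for both nodes), not the thesis code.

```python
# Simplified GATv2-style dynamic attention layer over a dependency graph.
# Shapes, the shared projection, and the adjacency convention are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicGraphAttention(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)  # shared node projection
        self.a = nn.Linear(out_dim, 1, bias=False)       # attention scoring vector

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   (n_nodes, in_dim) node features from the dependency tree
        # adj: (n_nodes, n_nodes) adjacency mask with self-loops (1 where an edge exists)
        Wh = self.W(h)                                    # (n, out_dim)
        # Dynamic scoring: LeakyReLU is applied to the combined pair representation
        # *before* the attention vector, so neighbour rankings can differ per node.
        pair = Wh.unsqueeze(1) + Wh.unsqueeze(0)          # (n, n, out_dim)
        scores = self.a(F.leaky_relu(pair)).squeeze(-1)   # (n, n)
        scores = scores.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(scores, dim=-1)             # attention over neighbours
        return alpha @ Wh                                 # aggregated node features
```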
Keywords/Search Tags: Aspect sentiment analysis, Attention mechanism, Graph neural network, Pretrained model, Dependency syntactic parsing