
Research on Aspect-Level Sentiment Analysis Based on the BERT Model

Posted on: 2024-09-27  Degree: Master  Type: Thesis
Country: China  Candidate: Y N Chen  Full Text: PDF
GTID: 2568306920963279  Subject: Computer technology
Abstract/Summary:
Sentiment analysis is an important branch of natural language processing and plays a significant role in guiding production, informing marketing, and monitoring public opinion. Traditional coarse-grained sentiment analysis no longer meets today's needs, giving rise to fine-grained aspect-level sentiment analysis. To address the low accuracy of aspect-level sentiment analysis on short texts, this dissertation improves the pre-trained BERT model with respect to the factors affecting performance on the aspect-level sentiment analysis task. On this basis, the RGAT-CLBERT model for aspect-level sentiment analysis was constructed by fusing a relational graph attention network with the improved BERT model, which effectively improves performance on the task. The main work of this dissertation includes:

(1) Analysis of the word-vector representation anisotropy problem in the pre-trained BERT model. Starting from the model's attention mechanism and examining the function of each component step by step, we conclude that the model suffers from word-vector representation anisotropy, i.e. a state in which the word embeddings are unevenly distributed in the vector representation space. The causes of the problem are analysed at the geometric level, and the relevant theoretical support is given, paving the way for subsequently extending the model's application scenarios.

(2) The CLBERT model, an improved BERT obtained through a contrastive learning algorithm. The anisotropy of the pre-trained BERT model leads to inadequate extraction of contextual semantic information, which degrades performance on the aspect-level sentiment analysis task. To this end, the CLBERT model was obtained by introducing a contrastive learning algorithm to optimise the word-vector representations of the pre-trained BERT model. Test results show that, on a benchmark dataset for multi-task general language understanding evaluation, the model improves accuracy and F1 (F-score) values to various degrees over the BERT-base model, including a 1% improvement in accuracy on the SST-2 dataset and a 3.4% improvement in F1 on the MRPC dataset.

(3) The RGAT-CLBERT model for aspect-level sentiment analysis, constructed by fusing a relational graph attention network with CLBERT. Existing aspect-level sentiment analysis models ignore sentences that contain multiple aspect words; traditional models misclassify such sentences because they focus only on the sentiment word closest to each aspect word. At the same time, the pre-trained BERT model does not sufficiently extract contextual semantic information from the utterance, and together these factors affect the judgement of sentiment tendency. To this end, the RGAT-CLBERT model was constructed to exploit the interplay between syntactic structure information and contextual semantic information. The model was experimentally validated on three publicly available datasets (Twitter, Rest14, and Laptop14). The results show that RGAT-CLBERT improves accuracy and F1 values for aspect-level sentiment analysis, with F1 on the Rest14 and Laptop14 datasets improved by 1.9% and 1.6% respectively.

The RGAT-CLBERT model constructed in this dissertation improves the performance of aspect-level sentiment analysis tasks to a certain extent and provides a reference for the development of natural language processing, with broad application prospects.
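The dissertation does not include code, but the anisotropy described in contribution (1) can be illustrated with a standard diagnostic: when embeddings collapse into a narrow cone, the average pairwise cosine similarity is high, whereas isotropic (uniformly spread) directions average near zero. The following sketch uses synthetic vectors; the function name and the shift used to simulate anisotropy are illustrative assumptions, not the thesis's method.

```python
import numpy as np

def mean_pairwise_cosine(embeddings: np.ndarray) -> float:
    """Average cosine similarity over all distinct pairs of row vectors."""
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = normed @ normed.T
    n = len(embeddings)
    # exclude the diagonal (each vector's self-similarity of 1.0)
    return float((sims.sum() - n) / (n * (n - 1)))

rng = np.random.default_rng(0)
isotropic = rng.normal(size=(200, 64))        # directions spread uniformly
anisotropic = isotropic + 5.0 * np.ones(64)   # shifted into a narrow cone

print(mean_pairwise_cosine(isotropic))    # near 0
print(mean_pairwise_cosine(anisotropic))  # close to 1
```

A high value of this statistic on real BERT token embeddings is the usual symptom of the uneven distribution the abstract refers to.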
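The abstract does not specify which contrastive objective CLBERT uses for contribution (2). A common choice for repairing sentence-embedding anisotropy is an InfoNCE-style loss, in which each representation is pulled toward its positive pair and pushed away from all other examples in the batch. The sketch below is a minimal, assumed formulation in NumPy (the function name, temperature value, and use of in-batch negatives are illustrative assumptions):

```python
import numpy as np

def info_nce_loss(z1: np.ndarray, z2: np.ndarray, tau: float = 0.05) -> float:
    """InfoNCE contrastive loss: z1[i] should match z2[i] among all rows of z2."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = (z1 @ z2.T) / tau                       # temperature-scaled cosine scores
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))       # positives sit on the diagonal

rng = np.random.default_rng(1)
z = rng.normal(size=(32, 16))
z_pos = z + 0.01 * rng.normal(size=(32, 16))         # near-identical positive views
loss_aligned = info_nce_loss(z, z_pos)
loss_random = info_nce_loss(z, rng.normal(size=(32, 16)))
```

Minimising such a loss spreads representations apart (the negative term), which is why contrastive training counteracts the anisotropy discussed above.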
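For contribution (3), the abstract describes attention over a dependency parse in which the relation label of each arc influences the attention score, so that sentiment words are tied to the correct aspect word rather than merely the nearest one. The toy sketch below shows one way such relation-conditioned attention can be formed; the scoring function, names, and additive relation embedding are illustrative assumptions, not the thesis's RGAT-CLBERT architecture.

```python
import numpy as np

def relational_graph_attention(h: np.ndarray, edges, rel_emb: np.ndarray) -> np.ndarray:
    """One relation-aware attention pass over a dependency graph.

    h:       (n, d) token representations (e.g. contextual embeddings)
    edges:   list of (src, dst, rel_id) dependency arcs
    rel_emb: (num_rels, d) one embedding per dependency relation label
    Each node attends over its incoming arcs, with the score conditioned
    on the arc's relation label, and adds the weighted neighbour messages.
    """
    n, d = h.shape
    out = h.copy()
    for i in range(n):
        incoming = [(s, r) for s, t, r in edges if t == i]
        if not incoming:
            continue  # nodes with no incoming arcs are left unchanged
        scores = np.array([h[i] @ (h[s] + rel_emb[r]) / np.sqrt(d)
                           for s, r in incoming])
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                     # softmax over incoming arcs
        out[i] = h[i] + sum(w * h[s] for w, (s, r) in zip(weights, incoming))
    return out

rng = np.random.default_rng(2)
h = rng.normal(size=(4, 8))                          # 4 tokens, dimension 8
rel_emb = rng.normal(size=(2, 8))                    # 2 relation labels
edges = [(0, 1, 0), (2, 1, 1), (1, 3, 0)]            # toy dependency arcs
updated = relational_graph_attention(h, edges, rel_emb)
```

In the actual model this kind of syntactic message passing is fused with the CLBERT contextual representations before classification.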
Keywords/Search Tags: BERT model, Aspect-level sentiment analysis, Contrastive learning algorithm, Dependency parse tree, Relational graph attention network