
Research And Implementation Of Online Comment Sentiment Analysis Algorithm Based On Attention Mechanism

Posted on: 2021-11-19    Degree: Master    Type: Thesis
Country: China    Candidate: P Ji    Full Text: PDF
GTID: 2518306308969209    Subject: Computer Science and Technology
Abstract/Summary:
With the development of the Internet and mobile terminals, more and more people share information online and write their own reviews, including movie reviews, shopping reviews, news reviews, and so on. Traditional sentiment analysis can only give the emotional tendency of an entire sentence. With the introduction of the attention mechanism, deep learning has developed rapidly in the field of natural language processing: the attention mechanism can capture contextual information and better handle problems at the semantic level. Fine-grained sentiment analysis consists of two processes: extraction of evaluation objects (aspects) and sentiment judgment of those objects. This thesis applies deep neural network models based on the attention mechanism to fine-grained sentiment analysis, and verifies that BERT word vectors offer advantages over earlier word vectors for sentiment analysis.

Aspect extraction is a sequence labeling task. After analyzing the principle and structure of the traditional RNN-CRF framework, this thesis improves on some of its shortcomings and proposes the MS-Att-CNNs model: a convolutional neural network combined with an attention mechanism replaces the LSTM and CRF layers, and multi-semantic embedding is used in the word embedding process. The Bi-LSTM+CRF and MS-Att-CNNs models are compared on the laptop and restaurant domains of the SemEval-2014 task (International Workshop on Semantic Evaluation). Compared with Bi-LSTM+CRF, the F1 score of MS-Att-CNNs is 2.34% higher on the laptop dataset and 0.96% higher on the restaurant dataset.

After analyzing the training process and shortcomings of fine-grained sentiment analysis models based on LSTM and attention, a model named PW-MHA, based on multi-head attention and position weights, is proposed. Compared with previous models, PW-MHA introduces three major changes: a multi-head attention mechanism, position weights, and a feature extraction operation before softmax normalization. Compared with MGAN, the strongest of the benchmark models, the accuracy and F1 of PW-MHA are 2.3% and 3.34% higher in the laptop domain and 2.2% and 3.23% higher in the restaurant domain, respectively.
Keywords/Search Tags:fine-grained sentiment analysis, attention mechanism, aspect extraction, position weight
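The abstract claims an advantage for BERT word vectors over earlier static embeddings. The snippet below is a minimal sketch of how contextual word vectors can be extracted from a pre-trained BERT model, assuming the Hugging Face `transformers` library and the `bert-base-uncased` checkpoint; the thesis does not specify its tooling, so these choices are illustrative only.

```python
# Sketch: extracting contextual word vectors from a pre-trained BERT model.
# Assumes the Hugging Face `transformers` and `torch` packages; the model name
# and library are illustrative, not necessarily what the thesis used.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

sentence = "The battery life is great but the screen is dim."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per word piece: shape (1, seq_len, 768).
word_vectors = outputs.last_hidden_state
print(word_vectors.shape)
```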
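For the aspect-extraction model MS-Att-CNNs, the abstract only states that a convolutional network combined with attention replaces the LSTM and CRF layers. The sketch below shows one possible arrangement of that idea as a per-token BIO tagger; the layer sizes, the single self-attention layer, and the three-tag output are assumptions, not the thesis architecture, and multi-semantic embedding is omitted.

```python
# Sketch of a CNN-plus-attention tagger in the spirit of MS-Att-CNNs:
# convolutions capture local n-gram features, self-attention mixes in global
# context, and a per-token linear layer predicts BIO aspect tags.
import torch
import torch.nn as nn

class CNNAttentionTagger(nn.Module):
    def __init__(self, embed_dim=768, conv_channels=256, num_heads=4, num_tags=3):
        super().__init__()
        self.conv = nn.Conv1d(embed_dim, conv_channels, kernel_size=3, padding=1)
        self.attn = nn.MultiheadAttention(conv_channels, num_heads, batch_first=True)
        self.classifier = nn.Linear(conv_channels, num_tags)  # B / I / O tags

    def forward(self, embeddings):                    # (batch, seq_len, embed_dim)
        x = embeddings.transpose(1, 2)                # (batch, embed_dim, seq_len)
        x = torch.relu(self.conv(x)).transpose(1, 2)  # (batch, seq_len, channels)
        ctx, _ = self.attn(x, x, x)                   # self-attention over the sentence
        return self.classifier(ctx)                   # per-token tag logits

# Example: tag a batch of 2 sentences of length 10 given random embeddings.
tagger = CNNAttentionTagger()
logits = tagger(torch.randn(2, 10, 768))
print(logits.shape)  # torch.Size([2, 10, 3])
```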
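For PW-MHA, the abstract names three ingredients: multi-head attention, position weights, and a feature-extraction step before the softmax. The sketch below combines them in one plausible way, with the aspect vector attending over position-weighted context vectors; the linear distance weighting, layer sizes, and three-class output are assumptions rather than the published formulation.

```python
# Sketch of position-weighted multi-head attention for aspect-level sentiment:
# context vectors are down-weighted by distance to the aspect term, the aspect
# vector attends over the weighted context, and a linear feature-extraction
# layer precedes the softmax classifier. Details are assumptions, not PW-MHA.
import torch
import torch.nn as nn

class PositionWeightedAttention(nn.Module):
    def __init__(self, embed_dim=768, num_heads=8, num_classes=3):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.feature = nn.Linear(embed_dim, embed_dim)       # feature extraction before softmax
        self.classifier = nn.Linear(embed_dim, num_classes)  # sentiment classes

    def forward(self, context, aspect, aspect_pos):
        # context: (batch, seq_len, dim); aspect: (batch, 1, dim); aspect_pos: (batch,)
        seq_len = context.size(1)
        positions = torch.arange(seq_len, device=context.device).unsqueeze(0)
        distance = (positions - aspect_pos.unsqueeze(1)).abs().float()
        weights = 1.0 - distance / seq_len                   # closer words get larger weights
        weighted = context * weights.unsqueeze(-1)
        attended, _ = self.attn(aspect, weighted, weighted)  # aspect queries the context
        features = torch.relu(self.feature(attended.squeeze(1)))
        return self.classifier(features)                     # sentiment logits

model = PositionWeightedAttention()
logits = model(torch.randn(2, 12, 768), torch.randn(2, 1, 768), torch.tensor([3, 7]))
print(logits.shape)  # torch.Size([2, 3])
```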