
Aspect Based Sentiment Analysis Based On GPT And Attention

Posted on: 2022-11-19    Degree: Master    Type: Thesis
Country: China    Candidate: C C Wang    Full Text: PDF
GTID: 2518306761459734    Subject: Automation Technology
Abstract/Summary:
With the spread of the internet and the proliferation of smart devices, a vast amount of textual evaluation of all kinds of subjects has accumulated online, and the volume of such text grows exponentially over time. These texts carry a deeper meaning: they contain the publisher's genuine opinions and evaluations of things, and as their number increases, the overall sentiment they express becomes more representative. Analyzing these texts is therefore extremely valuable: it helps businesses design the best marketing strategies for their customers, helps product owners decide whether to improve their products based on user sentiment, and helps government departments monitor public opinion and prevent it from boiling over. This is why so many talented researchers have entered the field in recent years and achieved so much. As the demand for accuracy has grown, aspect-level sentiment analysis has emerged. This paper proposes an aspect-level sentiment analysis model based on GPT and bidirectional GRU networks.

The word embedding method has a significant impact on the performance of a deep learning model, and many word embedding methods have appeared in recent years to strengthen that impact. Among the methods commonly used at this stage, fixed word-vector representations such as Word2vec and GloVe cannot resolve polysemy. To achieve better results, this paper adopts a dynamic word-vector representation as its word embedding method. As a dynamic representation, the Transformer-based GPT pre-training model can resolve polysemy and better extract the contextual semantic features of the text. To give the word vectors richer semantics, this paper combines GPT with an aspect encoding. After obtaining the word vectors, we input them into a multi-layer bidirectional GRU network with an attention mechanism. Compared with a one-way neural network that processes the sequence only from front to back, the bidirectional GRU can connect the output at the current moment with both the previous and the following moments, so that deeper features of the text can be extracted and important information made more prominent. Finally, the output layer applies the Softmax function for classification, yielding the final sentiment classification result.

The datasets used in this paper are three classic aspect-level sentiment classification datasets: Restaurant, Laptop, and Twitter. These are the standard benchmarks for aspect-level sentiment analysis and are well suited to our experimental needs. We first compare several word embedding models on the Laptop dataset to highlight the advantage of GPT. The three datasets are then evaluated on the new aspect-level sentiment analysis model proposed in this paper, and the experimental results show varying degrees of improvement over the other models.
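The "GPT plus aspect encoding" step can be illustrated with a minimal NumPy sketch. This is not the thesis's actual implementation: the dimensions, random values, and the concatenation scheme (tiling one aspect vector onto every token's contextual embedding) are illustrative assumptions standing in for GPT hidden states and a learned aspect encoding.

```python
import numpy as np

# Illustrative shapes only, not taken from the thesis.
seq_len, d_ctx, d_asp = 6, 8, 4   # tokens, contextual dim, aspect dim

rng = np.random.default_rng(0)
ctx_embeddings = rng.normal(size=(seq_len, d_ctx))  # stand-in for GPT token outputs
aspect_embedding = rng.normal(size=(d_asp,))        # stand-in for the aspect encoding

# Tile the aspect vector across time steps and concatenate it onto
# every token's contextual embedding.
aspect_tiled = np.tile(aspect_embedding, (seq_len, 1))
enriched = np.concatenate([ctx_embeddings, aspect_tiled], axis=1)

print(enriched.shape)  # (6, 12)
```

Each enriched token vector now carries both contextual semantics and an explicit signal about which aspect the sentiment is being judged against.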
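The downstream pipeline described in the abstract (bidirectional GRU over the sequence, attention over the hidden states, Softmax classification) can be sketched as follows. This is a single-layer, pure-NumPy forward pass under assumed toy dimensions; the thesis's model is multi-layer and trained, and the additive attention form used here is one common choice, not necessarily the author's.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_pass(X, Wz, Uz, Wr, Ur, Wh, Uh):
    """Run one GRU direction over X of shape (seq_len, d_in)."""
    h = np.zeros(Uz.shape[0])
    states = []
    for x in X:
        z = sigmoid(Wz @ x + Uz @ h)              # update gate
        r = sigmoid(Wr @ x + Ur @ h)              # reset gate
        h_tilde = np.tanh(Wh @ x + Uh @ (r * h))  # candidate state
        h = (1 - z) * h + z * h_tilde
        states.append(h)
    return np.stack(states)

rng = np.random.default_rng(1)
seq_len, d_in, d_hid, n_classes = 6, 12, 5, 3     # toy sizes
X = rng.normal(size=(seq_len, d_in))              # stand-in input vectors

def init_params():
    # Wz, Uz, Wr, Ur, Wh, Uh for one direction.
    return [rng.normal(scale=0.1, size=s)
            for s in [(d_hid, d_in), (d_hid, d_hid)] * 3]

H_fwd = gru_pass(X, *init_params())
H_bwd = gru_pass(X[::-1], *init_params())[::-1]   # backward pass, re-aligned
H = np.concatenate([H_fwd, H_bwd], axis=1)        # (seq_len, 2*d_hid)

# Attention over time steps: score each hidden state, then
# form a weighted summary of the sequence.
w_att = rng.normal(scale=0.1, size=(2 * d_hid,))
scores = np.tanh(H) @ w_att
alpha = np.exp(scores) / np.exp(scores).sum()     # attention weights
context = alpha @ H                               # weighted summary

# Softmax output layer over sentiment classes.
W_out = rng.normal(scale=0.1, size=(n_classes, 2 * d_hid))
logits = W_out @ context
probs = np.exp(logits) / np.exp(logits).sum()
print(probs)
```

Because the backward pass reads the sequence end to start, each position's concatenated state sees both its left and right context, which is exactly the advantage over a one-way network that the abstract describes.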
Keywords/Search Tags: Deep learning, Word embedding, Attention mechanism, GPT