
Sentence-Level And Aspect-Term Sentiment Analysis Based On Deep Learning

Posted on: 2022-06-24
Degree: Master
Type: Thesis
Country: China
Candidate: S Y Yu
Full Text: PDF
GTID: 2518306605489554
Subject: Master of Engineering
Abstract/Summary:
Sentence-level sentiment analysis and aspect-term sentiment analysis are two basic tasks in text sentiment analysis, and both have important practical significance and research value. The goal of sentence-level sentiment analysis is to uncover the sentiment and tendency expressed by a sentence, while the goal of aspect-term sentiment analysis is to determine the sentiment polarity of a text toward its different aspects. To address the insufficient capture of semantic features in sentence-level sentiment analysis, this thesis studies semantic feature extraction methods on the basis of previous work and proposes a sentence-level sentiment analysis model, ATTBiLSTM-ATTCNN. To address the problems of scarce data, insufficient capture of sentiment information, and low classification accuracy in aspect-term sentiment analysis, and inspired by pre-trained language models combined with feature extraction models, an aspect-term sentiment analysis model, ALBERT-BiLSTMCNN, is established. The details of the two models are as follows.

(1) A sentence-level sentiment analysis model based on ATTBiLSTM-ATTCNN, which alleviates the insufficient feature capture in sentence-level sentiment analysis. The model first combines the attention mechanism with a bidirectional LSTM to obtain the ATTBiLSTM network, which adds an attention layer after the BiLSTM so that the model pays more attention to the more important words in a sentence. The attention mechanism is then combined with a CNN to obtain the ATTCNN network, which replaces the traditional pooling in the CNN with attention-based pooling: the n-gram features produced by convolution are weighted before being pooled, highlighting the important convolutional n-gram semantic information. Finally, ATTBiLSTM and ATTCNN are combined: the sentence sequence is fed into both networks, and the sentence semantic vectors they output are concatenated and passed to the classification layer for sentiment polarity classification. To verify the performance of ATTBiLSTM-ATTCNN, this thesis conducts experiments on a dataset of user reviews of electronic products and compares the model with other sentence-level sentiment classification models that use different structures and attention mechanisms. The results show that ATTBiLSTM-ATTCNN is effective: its accuracy is 80.85% and its Macro-F1 is 79.56%, both higher than the other models. ATTBiLSTM-ATTCNN can therefore effectively perform sentence-level sentiment polarity classification with good performance.
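As a rough illustration of the structure described in (1), the PyTorch sketch below wires a BiLSTM-plus-attention branch and a CNN branch with attention-based pooling, then concatenates their outputs for classification. The layer sizes, the simple additive attention scoring, and the three-class output are illustrative assumptions rather than the thesis's exact configuration; only the overall wiring follows the description above.

```python
# Minimal sketch of the ATTBiLSTM-ATTCNN idea (sizes/attention form assumed).
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionPool(nn.Module):
    """Score each time step, softmax the scores, and return the weighted sum."""

    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, feats):                            # (batch, seq_len, dim)
        weights = F.softmax(self.score(feats), dim=1)    # (batch, seq_len, 1)
        return (weights * feats).sum(dim=1)              # (batch, dim)


class ATTBiLSTM_ATTCNN(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hidden=128,
                 n_filters=100, kernel_size=3, n_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # ATTBiLSTM branch: BiLSTM followed by an attention layer.
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                              bidirectional=True)
        self.lstm_att = AttentionPool(2 * hidden)
        # ATTCNN branch: convolution, then attention-based pooling of the
        # n-gram features instead of traditional max pooling.
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size, padding=1)
        self.cnn_att = AttentionPool(n_filters)
        # Classification layer on the concatenated sentence vectors.
        self.fc = nn.Linear(2 * hidden + n_filters, n_classes)

    def forward(self, token_ids):                # token_ids: (batch, seq_len)
        emb = self.embed(token_ids)              # (batch, seq_len, emb_dim)
        lstm_out, _ = self.bilstm(emb)           # (batch, seq_len, 2*hidden)
        lstm_vec = self.lstm_att(lstm_out)       # (batch, 2*hidden)
        ngrams = self.conv(emb.transpose(1, 2))  # (batch, n_filters, seq_len)
        cnn_vec = self.cnn_att(ngrams.transpose(1, 2))   # (batch, n_filters)
        return self.fc(torch.cat([lstm_vec, cnn_vec], dim=-1))


# Example: classify a toy batch of two padded sentences of length 8.
model = ATTBiLSTM_ATTCNN(vocab_size=5000)
logits = model(torch.randint(1, 5000, (2, 8)))
print(logits.shape)  # torch.Size([2, 3])
```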
(2) An aspect-term sentiment polarity classification model based on ALBERT-BiLSTMCNN, which alleviates the problems of scarce data, insufficient capture of sentiment information, and low classification accuracy in aspect-term sentiment polarity classification. First, the ALBERT model is used, and the aspect-term sentiment classification data are constructed into aspect-sentence pairs as the input to ALBERT. In natural language processing tasks, the BERT pre-trained language model is widely used because it can be pre-trained on large-scale unlabeled corpora to improve performance and alleviate data scarcity in some tasks. ALBERT is an improved version of BERT: compared with BERT, it greatly reduces the number of parameters and the memory overhead, improves training speed, and still achieves good classification performance and a strong ability to capture sentiment features. The semantic vectors of the aspect-sentence pairs output by ALBERT are then fed into the BiLSTMCNN for feature extraction. The BiLSTMCNN combines the ability of the BiLSTM to capture global information with the ability of the CNN to capture local features; it sits between the ALBERT layer and the classification layer, fully extracts the sentiment features in the word embedding vectors output by ALBERT, and yields a more accurate aspect-term sentiment polarity classification.

To verify the performance of ALBERT-BiLSTMCNN, experiments were conducted on the Restaurants 2014 and Laptop 2014 datasets of SemEval 2014. The accuracy of ALBERT-BiLSTMCNN is 86.49% on the Restaurants 2014 dataset and 79.51% on the Laptop 2014 dataset, both higher than the other models. The parameter counts of ALBERT and BERT are also compared, and the results show that ALBERT has significantly fewer parameters than BERT. The experimental results show that ALBERT-BiLSTMCNN achieves excellent performance on aspect-term sentiment polarity classification.
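The following sketch illustrates how an aspect-sentence pair can be encoded and passed through a BiLSTM-CNN head on top of ALBERT, under the assumption that the Hugging Face transformers interface and the albert-base-v2 checkpoint stand in for the thesis's ALBERT. The abstract does not specify whether the BiLSTM and CNN are stacked or run in parallel, nor the pooling used after the CNN, so the sequential wiring and max pooling below are assumptions.

```python
# Rough sketch of the ALBERT-BiLSTMCNN pipeline: the aspect term and its
# sentence are encoded as a sentence pair by ALBERT, and a BiLSTM-CNN head
# sits between ALBERT and the classifier. Checkpoint name, layer sizes,
# sequential BiLSTM->CNN wiring, and max pooling are assumptions.
import torch
import torch.nn as nn
from transformers import AlbertModel, AlbertTokenizer


class AlbertBiLSTMCNN(nn.Module):
    def __init__(self, albert_name="albert-base-v2", hidden=128,
                 n_filters=100, kernel_size=3, n_classes=3):
        super().__init__()
        self.albert = AlbertModel.from_pretrained(albert_name)
        dim = self.albert.config.hidden_size
        # BiLSTM captures global context over the ALBERT token vectors.
        self.bilstm = nn.LSTM(dim, hidden, batch_first=True,
                              bidirectional=True)
        # CNN captures local n-gram features from the BiLSTM states.
        self.conv = nn.Conv1d(2 * hidden, n_filters, kernel_size, padding=1)
        self.fc = nn.Linear(n_filters, n_classes)

    def forward(self, input_ids, attention_mask, token_type_ids):
        out = self.albert(input_ids=input_ids,
                          attention_mask=attention_mask,
                          token_type_ids=token_type_ids)
        seq = out.last_hidden_state                   # (batch, seq_len, dim)
        lstm_out, _ = self.bilstm(seq)                # (batch, seq_len, 2*hidden)
        feats = self.conv(lstm_out.transpose(1, 2))   # (batch, n_filters, seq_len)
        pooled = torch.amax(feats, dim=-1)            # pool over positions
        return self.fc(pooled)


# Example: one (sentence, aspect term) pair encoded as a sentence pair.
tokenizer = AlbertTokenizer.from_pretrained("albert-base-v2")
batch = tokenizer("The food was great but the service was slow.", "service",
                  return_tensors="pt")
model = AlbertBiLSTMCNN()
logits = model(**batch)
print(logits.shape)  # torch.Size([1, 3])
```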
Keywords/Search Tags:Sentiment Analysis, Sentence Level, Aspect Words, Convolutional Neural Network, Bidirectional Long Short-Term Memory, Attention Mechanism, ALBERT