
Sentiment Classification Study Based On Multi-feature Fusion And Attention Mechanism

Posted on: 2022-05-03  Degree: Master  Type: Thesis
Country: China  Candidate: X Y Xu  Full Text: PDF
GTID: 2518306608976239  Subject: Computer technology
Abstract/Summary:
With the rapid development of social networks and e-commerce, online shopping has become an indispensable part of daily life, and product review texts have grown explosively. Mining these reviews with sentiment analysis techniques reveals the strengths and weaknesses of products from the consumers' perspective and offers a useful reference for potential buyers. However, traditional deep learning methods suffer from problems such as a single word-vector feature as input, an inability to distinguish different senses of the same word, and neglect of the position and frequency information of feature items. To address these problems, this paper proposes a sentiment classification model based on an attention mechanism and multi-feature fusion. The CNN-BiLSTM model commonly used for sentiment classification is taken as the baseline and improved so that it performs better on sentiment classification tasks. The main research content includes the following aspects:

(1) A convolutional bidirectional long short-term memory network fused with sentiment features (SF-CNN-BiLSTM) is proposed to address the problem that the same word can carry different meanings. Building on an analysis of the shortcomings of the CNN-BiLSTM sentiment classification model, this paper vectorizes word, part-of-speech, and word-sentiment features and concatenates them as the model's input layer, expanding the amount of information carried by each word vector; this multi-feature fusion alleviates polysemy and expresses multiple sentiment features. Chunk-Max Pooling then replaces global max pooling so that the sentiment orientation before and after transitional sentences is taken into account, and syntactic-structure position features are integrated, improving the traditional network structure and the accuracy of the sentiment classification results. The experimental results show that the SF-CNN-BiLSTM model classifies text sentiment better than other traditional network models.
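The abstract does not give implementation details, so the following is a minimal, illustrative PyTorch sketch of the ideas in (1): word, part-of-speech, and word-sentiment embeddings are concatenated into a fused input vector, a convolution layer extracts local features, Chunk-Max Pooling keeps one maximum per segment instead of a single global maximum, and a BiLSTM produces the sentence representation. All class names, dimensions, and hyperparameters here are assumptions for illustration, not the thesis's actual configuration.

# Illustrative sketch only; not the author's implementation.
import torch
import torch.nn as nn

class SFCNNBiLSTM(nn.Module):
    def __init__(self, vocab=20000, pos_tags=50, senti_tags=5,
                 d_word=200, d_pos=50, d_senti=50,
                 channels=128, hidden=128, chunks=4, classes=2):
        super().__init__()
        # Separate embedding tables for word, part-of-speech, and word-sentiment features
        self.word_emb = nn.Embedding(vocab, d_word)
        self.pos_emb = nn.Embedding(pos_tags, d_pos)
        self.senti_emb = nn.Embedding(senti_tags, d_senti)
        d_in = d_word + d_pos + d_senti               # fused feature dimension
        self.conv = nn.Conv1d(d_in, channels, kernel_size=3, padding=1)
        self.chunks = chunks                          # number of pooling segments
        self.bilstm = nn.LSTM(channels, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, classes)

    def chunk_max_pool(self, x):
        # x: (batch, channels, seq_len). Keep one max per chunk instead of one
        # global max, so some positional structure of the sentence is preserved.
        parts = torch.chunk(x, self.chunks, dim=2)
        return torch.cat([p.max(dim=2, keepdim=True).values for p in parts], dim=2)

    def forward(self, words, pos, senti):
        # words / pos / senti: (batch, seq_len) integer indices
        feats = torch.cat([self.word_emb(words),
                           self.pos_emb(pos),
                           self.senti_emb(senti)], dim=-1)      # multi-feature fusion
        conv_out = torch.relu(self.conv(feats.transpose(1, 2)))  # (batch, channels, seq_len)
        pooled = self.chunk_max_pool(conv_out).transpose(1, 2)   # (batch, chunks, channels)
        _, (h, _) = self.bilstm(pooled)
        sent_repr = torch.cat([h[0], h[1]], dim=-1)              # final forward/backward states
        return self.fc(sent_repr)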
(2) A sentiment classification model based on an attention mechanism and multi-feature fusion (SF-CNN-BiLSTM-ATT) is proposed to reduce the influence of redundant contextual information on text sentiment classification. By adding an attention mechanism to the training of the neural network, a weighted semantic representation of the feature vectors is generated, the relationship between the current keyword and its relevant context is strengthened, and the emotional inclination of the text can be analyzed more accurately by focusing on the keywords that strongly influence the sentiment classification task. Compared with the SF-CNN-BiLSTM model, the accuracy, recall, and F1 value of the SF-CNN-BiLSTM-ATT model are all improved. The experimental results show that the proposed SF-CNN-BiLSTM-ATT model effectively improves the accuracy of sentiment classification.

In summary, this paper innovatively uses multi-feature word vectors as the input layer to enrich the semantic features fed to the model, uses chunk-wise pooling to account for the sentiment orientation before and after transitional sentences, and introduces an attention mechanism that weights each word according to its degree of influence, strengthening keywords and thereby improving classification accuracy. (21 figures, 8 tables, 66 references)
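As a complement to (2), the following is a minimal, illustrative sketch (again in PyTorch, not the thesis code) of a simple additive attention layer placed over the BiLSTM outputs: each time step receives a learned weight, and the sentence vector is the weighted sum of the hidden states, so words that matter more for sentiment contribute more to the final representation. The usage lines at the end are hypothetical and refer to the assumed model from the earlier sketch.

# Illustrative sketch only; not the author's implementation.
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Scores each BiLSTM time step and returns a weighted sentence vector."""

    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, 1, bias=False)

    def forward(self, states):
        # states: (batch, seq_len, dim) outputs of the BiLSTM
        scores = self.v(torch.tanh(self.proj(states))).squeeze(-1)   # (batch, seq_len)
        weights = torch.softmax(scores, dim=1)                       # attention weights
        return torch.bmm(weights.unsqueeze(1), states).squeeze(1)    # (batch, dim)

# Hypothetical usage inside the classifier's forward pass:
#   outputs, _ = self.bilstm(pooled)        # (batch, seq_len, 2 * hidden)
#   sent_repr = self.attention(outputs)     # weighted semantic representation
#   logits = self.fc(sent_repr)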
Keywords/Search Tags:sentiment classification, CNN, Bi-LSTM, attention mechanism