
Sentiment Analysis Of Reviews Based On CNN-BiGRU-Attention

Posted on: 2023-08-06
Degree: Master
Type: Thesis
Country: China
Candidate: Q N Zhu
Full Text: PDF
GTID: 2568306842967949
Subject: Probability theory and mathematical statistics
Abstract/Summary:
With the development of Internet technology, more and more consumers share and communicate online, generating a large volume of review text. Mining the emotional attitudes of consumers hidden in these massive reviews is of great significance to consumers, enterprises, and Internet platforms. In sentiment analysis, traditional methods rely heavily on manual feature design and have limited feature-extraction ability, so they cannot meet current requirements. Deep-learning-based methods have proven to be state-of-the-art, and an effective fusion of models outperforms any single model. This thesis therefore proposes two fusion models and applies them to long review texts, with the aim of improving text-classification performance.

First, to address the inadequate feature extraction of single models and the large parameter counts of fusion models on long review texts, we propose a fusion model named BiGRU-Multi-Att-DSTCN (Bidirectional Gated Recurrent Unit, Multi-head Attention, Depthwise Separable Temporal Convolutional Network). The model uses bidirectional gated recurrent units to extract the global information of the text; a multi-head attention mechanism is then integrated to capture features from different subspaces of the review text; next, the representations are passed to a temporal convolutional network that uses depthwise separable convolution to reduce the number of trainable parameters; finally, the fully extracted features are fed through a global average pooling layer before classification.

Second, to handle category imbalance and hard samples in long reviews, we adjusted and optimized the structure of BiGRU-Multi-Att-DSTCN and propose a new fusion model named BiGRU-Att-HCNN (Bidirectional Gated Recurrent Unit, Self-Attention, Hybrid Convolutional Neural Network), which has fewer trainable parameters and better classification performance. The self-attention mechanism is integrated into the bidirectional gated recurrent unit; standard convolution and multi-layer atrous (dilated) convolution are run in parallel to extract multi-scale features at the phrase and sentence levels simultaneously, with each convolution replaced by a depthwise separable convolution computed in two steps. Because the traditional cross-entropy loss lets the many easy samples of the majority class dominate the total loss while under-weighting hard samples, this study uses focal loss, together with the Leaky ReLU activation function to accommodate the small loss values produced by focal loss.

Finally, the two models are evaluated on three public datasets: IMDB, Yelp2013, and TSB. Compared with 18 benchmark models, our two fusion models BiGRU-Multi-Att-DSTCN and BiGRU-Att-HCNN show clear advantages on six evaluation metrics, with the latter performing better. We also analyze the important hyperparameters and conduct ablation experiments on several key modules of the fusion models. Both fusion models show superior classification performance on long reviews.
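The parameter saving from depthwise separable convolution can be illustrated with a small counting sketch (our own illustration, not code from the thesis): a standard 1-D convolution with kernel size k, C_in input channels, and C_out output channels needs k·C_in·C_out weights, while its depthwise separable counterpart (a depthwise step followed by a pointwise step) needs only k·C_in + C_in·C_out, biases ignored. The channel sizes below are assumed, typical text-CNN values.

```python
# Illustrative parameter counts for 1-D (temporal) convolutions, biases ignored.
# A hypothetical sketch, not the thesis's implementation.

def standard_conv_params(k: int, c_in: int, c_out: int) -> int:
    """A standard 1-D convolution mixes positions and channels in one step."""
    return k * c_in * c_out

def depthwise_separable_params(k: int, c_in: int, c_out: int) -> int:
    """Depthwise step (k * c_in) followed by pointwise 1x1 step (c_in * c_out)."""
    return k * c_in + c_in * c_out

if __name__ == "__main__":
    k, c_in, c_out = 3, 128, 128  # assumed example sizes
    std = standard_conv_params(k, c_in, c_out)        # 49152
    sep = depthwise_separable_params(k, c_in, c_out)  # 16768
    print(std, sep, round(sep / std, 3))              # ratio ~0.341
```

The saving grows with the kernel size and channel width, which is why the abstract emphasizes it for parameter-heavy fusion models on long texts.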
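Focal loss addresses the imbalance problem described above by down-weighting well-classified (easy) examples so that hard samples and minority classes contribute more to the total loss. A minimal scalar sketch of the standard formulation follows (the gamma and alpha values are common defaults, not the thesis's settings):

```python
import math

def focal_loss(p_t: float, gamma: float = 2.0, alpha: float = 0.25) -> float:
    """Focal loss for a single sample, where p_t is the predicted probability
    of the true class. With gamma = 0 and alpha = 1 this reduces to the
    ordinary cross-entropy -log(p_t)."""
    return -alpha * (1.0 - p_t) ** gamma * math.log(p_t)

# An easy sample (p_t = 0.9) is scaled by (1 - 0.9)^2 = 0.01 relative to
# cross-entropy, while a hard sample (p_t = 0.1) keeps almost all of its
# loss, so hard samples dominate the gradient instead of easy ones.
```

With gamma > 0, the modulating factor (1 - p_t)^gamma shrinks toward zero as predictions become confident, which is what prevents the many easy majority-class samples from occupying the main body of the total loss.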
Keywords/Search Tags:Sentiment analysis, Neural network, Depthwise separable convolution, Attention mechanism, Focal loss