
Research On Text Classification Method Combining Attention Mechanism And Bi-GRU

Posted on: 2021-03-19    Degree: Master    Type: Thesis
Country: China    Candidate: Y Y Xiong    Full Text: PDF
GTID: 2428330629986198    Subject: Computer technology
Abstract/Summary:
Text classification has always been an important part of natural language processing. Text classification methods mainly extract features from the text and then assign a category according to those features, yet feature extraction has long been the central difficulty. Many statistics-based text classification methods are essentially matching methods: they are time-consuming and labor-intensive, and they require preset text features, which demands a high level of domain expertise. Traditional text classification has therefore tended to be costly and imprecise. With the rapid development of deep learning in recent years, deep learning methods have been widely applied to text feature extraction and have proven effective. In deep-learning-based text classification, the text features are learned by the classification network, so the choice of network directly affects the classification results, and the network also needs considerable training time to learn the features well. To address these problems, this thesis proposes a residual bidirectional GRU classification method with an attention mechanism that learns text features to improve the accuracy and efficiency of text sentiment classification (a minimal sketch of such an architecture is given after this abstract).

The main research contents of this thesis are as follows:

1. A text classification method combining an attention mechanism and Bi-GRU is proposed. Experiments show that, when processing long texts, the Bi-GRU is more efficient and stable than traditional methods in terms of automatic feature selection and modeling time-series dependencies.

2. A large number of experiments were carried out on the proposed method. Analysis of the results shows that the dilated convolution used for feature extraction effectively enlarges the receptive field and retains more context information.

3. Analysis of the experimental process shows that the attention mechanism in the classification network assigns weights to the keywords of the text vector, so that the network captures keyword information better and classifies faster.

4. Analysis of the experimental process shows that the residual connections in the deep neural network effectively suppress network degradation and help avoid the suboptimal solutions that arise in deeper networks.

5. A large number of experiments summarize how strongly the preset parameters of the deep neural network influence its performance, showing that parameter choice has a great effect on classification efficiency and that correct selection of the preset parameters is very important for a classification network.

In summary, this thesis proposes a method for classifying text by sentiment and evaluates it extensively on four public datasets. The experiments show that the method achieves very good and stable classification performance on text sentiment classification.
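The following is a minimal, illustrative sketch of the kind of architecture the abstract describes: an embedding layer, a dilated 1-D convolution with a residual connection for feature extraction, a bidirectional GRU, and additive attention over the time steps. It is written in PyTorch, which the thesis does not specify, and all layer names, sizes, and hyperparameters are assumptions for illustration, not the thesis's actual implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionBiGRUClassifier(nn.Module):
    """Illustrative sketch: dilated conv + residual + Bi-GRU + attention."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128,
                 num_classes=2, dilation=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Dilated convolution enlarges the receptive field over the token
        # sequence; padding is chosen so the sequence length is preserved.
        self.conv = nn.Conv1d(embed_dim, embed_dim, kernel_size=3,
                              dilation=dilation, padding=dilation)
        self.bigru = nn.GRU(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        # Additive attention: score each time step, softmax, weighted sum.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):                      # (batch, seq_len)
        x = self.embedding(token_ids)                  # (batch, seq_len, embed)
        # Residual connection around the dilated convolution.
        c = self.conv(x.transpose(1, 2)).transpose(1, 2)
        x = F.relu(x + c)
        h, _ = self.bigru(x)                           # (batch, seq_len, 2*hidden)
        scores = self.attn(h).squeeze(-1)              # (batch, seq_len)
        weights = torch.softmax(scores, dim=-1)        # attention weights
        context = torch.bmm(weights.unsqueeze(1), h).squeeze(1)
        return self.fc(context)                        # (batch, num_classes)


if __name__ == "__main__":
    model = AttentionBiGRUClassifier(vocab_size=10000)
    dummy = torch.randint(1, 10000, (4, 50))           # batch of 4 sequences
    print(model(dummy).shape)                          # torch.Size([4, 2])

In this sketch the attention layer scores every Bi-GRU time step and forms a weighted context vector, which corresponds to the weighting of keyword positions described in point 3 above; the residual addition around the convolution corresponds to the degradation suppression described in point 4.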
Keywords/Search Tags: Natural language processing, Long short-term memory network, Attention mechanism, Dilated convolution, ResNet