
Research On Emotion Classification Of Hidden Dangers In Coal Mines Based On Multi-feature LSTM-Self-Attention

Posted on: 2021-06-27  Degree: Master  Type: Thesis
Country: China  Candidate: Y R Dong  Full Text: PDF
GTID: 2481306095975859  Subject: Software engineering
Abstract/Summary:
In recent years, coal mine production safety in China has continued to improve, but a considerable gap remains compared with developed countries. To further improve coal mine production, the state has issued a series of production-safety policies and a unified standard for recording information on hidden safety dangers in the coal industry. This has raised a series of new problems. On the one hand, these massive data must be analyzed efficiently and accurately, and the text data among them contain rich content that has not yet been effectively mined. On the other hand, existing approaches classify coal mine safety hazards manually, so the classification results are not sufficiently scientific or rigorous. In response to these problems, this thesis carries out the following research.

This thesis focuses on textual information about coal mine safety hazards, performs sentiment analysis on it, and builds two deep-learning-based text sentiment classification models, one based on multi-feature LSTM-Self-Attention and one based on BERT with LSTM-Self-Attention, to automatically classify coal mine safety hazard information.

In the multi-feature LSTM-Self-Attention text sentiment classification model, a domain lexicon of coal mine safety hazard information is first constructed manually. The conventional text preprocessing for sentiment classification is then applied, part-of-speech features are added to the traditional input features, and a Word2vec model is used to train vector representations that the computer can process. Finally, an LSTM network extracts the sequential features of the text, and a self-attention mechanism (Self-Attention) is introduced to extract the grammatical and semantic features of the sentence, reducing the complexity of the task.

In the BERT-based LSTM-Self-Attention text sentiment classification model, the text is first encoded into vectors directly with the BERT model, which resolves polysemy and represents the features of the text more comprehensively. An LSTM model then extracts features, Self-Attention is added to further capture long-distance features, and finally a classifier performs the sentiment classification.

The two models proposed in this thesis effectively avoid the vanishing- and exploding-gradient problems of traditional recurrent neural networks, greatly shorten the distance between long-range dependent word features, and improve classification performance. The results show that the proposed method has stronger feature extraction ability and improves the accuracy of sentiment classification.
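The abstract does not include implementation details. As a rough illustration of the LSTM-Self-Attention architecture it describes, the following is a minimal PyTorch sketch, not the author's code: the class name, layer dimensions, random embedding table (standing in for the Word2vec plus part-of-speech vectors), and mean pooling over attention outputs are all assumptions made for the example.

```python
# Minimal sketch (assumed configuration, not the thesis's actual model):
# embeddings -> LSTM -> scaled dot-product self-attention -> softmax classifier.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LSTMSelfAttentionClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128, num_classes=2):
        super().__init__()
        # The thesis feeds Word2vec vectors concatenated with part-of-speech
        # features; a trainable embedding table stands in for both here.
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Projections for scaled dot-product self-attention over LSTM outputs.
        self.q_proj = nn.Linear(hidden_dim, hidden_dim)
        self.k_proj = nn.Linear(hidden_dim, hidden_dim)
        self.v_proj = nn.Linear(hidden_dim, hidden_dim)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        x = self.embedding(token_ids)             # (batch, seq_len, embed_dim)
        h, _ = self.lstm(x)                       # (batch, seq_len, hidden_dim)
        q, k, v = self.q_proj(h), self.k_proj(h), self.v_proj(h)
        scores = torch.bmm(q, k.transpose(1, 2)) / (h.size(-1) ** 0.5)
        attn = F.softmax(scores, dim=-1)          # self-attention weights
        context = torch.bmm(attn, v)              # (batch, seq_len, hidden_dim)
        pooled = context.mean(dim=1)              # average over sequence positions
        return self.classifier(pooled)            # class logits


# Example: classify a batch of two padded token-id sequences.
model = LSTMSelfAttentionClassifier(vocab_size=10000)
batch = torch.randint(1, 10000, (2, 20))
logits = model(batch)                             # shape: (2, num_classes)
```

In the BERT-based variant described above, the embedding layer would be replaced by contextual vectors produced by a pretrained BERT encoder, with the LSTM, self-attention, and classifier stages kept the same.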
Keywords/Search Tags:Coal mine safety hazard information, Sentiment classification, LSTM, Self-attention mechanism, Part of speech vector