In recent years, with the continuous development of artificial intelligence, natural language processing has made steady progress. As a branch of natural language processing, text classification is widely used. With increasing requirements for the accuracy of text classification, the single result produced by traditional single-label classification methods can no longer meet practical needs, and multi-label text classification, in which labels are not mutually exclusive, has gradually become a research hotspot. Among models for multi-label text classification, the bidirectional Transformer architecture of BERT can capture the semantic relationships in text and can be adapted to classification tasks in different scenarios to handle multi-label problems. However, this architecture also has obvious drawbacks: it ignores the semantic information of the labels themselves and the dependency relationships between labels, leading to label omission and semantic duplication. Based on this, this paper proposes a method that combines BERT with a label attention mechanism for text classification. The main contents include:

(1) To address the issue of contextual text semantic understanding, this paper proposes a method combining BERT with multi-label semantic understanding. The method improves how BERT substructures are used so as to better extract text features, while improving the downstream deep learning models to capture label relevance. Specifically, it implements three models by combining BERT with different downstream tasks, which improve the feature representation of text, extract local representations of text, and extract label correlations, respectively. Experiments verify the effectiveness of the three models in contextual text comprehension, feature extraction, and label sequence prediction, respectively, demonstrating the advantages of combining label dependency relationships with multi-label semantic understanding.

(2) To address the issue of the accuracy of label semantic information, this paper proposes a text classification method based on a label attention mechanism. The method learns context vector representations of the input text and of the label semantics through BERT, then uses an attention mechanism to capture the semantic correlation between label semantics and text information, highlighting the text features related to label information and improving the classification ability of the model. Experimental results show that this method achieves good performance on multi-label text classification tasks.
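As an illustration of the label attention idea described in (2), the following is a minimal sketch in PyTorch with the HuggingFace transformers library. It is an assumption-laden example rather than the paper's exact architecture: the model name, the use of learnable per-label query vectors, and the per-label sigmoid scoring head are all illustrative choices.

```python
# Minimal sketch of a BERT + label-attention multi-label classifier.
# Illustrative only: label queries, model name, and scoring head are assumptions.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class LabelAttentionClassifier(nn.Module):
    def __init__(self, num_labels, model_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        # One learnable query per label; in the paper's setting these could be
        # initialized from BERT encodings of the label names.
        self.label_queries = nn.Parameter(torch.randn(num_labels, hidden))
        self.scorer = nn.Linear(hidden, 1)

    def forward(self, input_ids, attention_mask):
        # Contextual token representations of the input text: (B, T, H)
        tokens = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        # Attention scores between each label query and each token: (B, L, T)
        scores = torch.einsum("lh,bth->blt", self.label_queries, tokens)
        scores = scores.masked_fill(attention_mask.unsqueeze(1) == 0, -1e9)
        weights = torch.softmax(scores, dim=-1)
        # Label-specific text representations, highlighting tokens relevant
        # to each label: (B, L, H)
        label_repr = torch.einsum("blt,bth->blh", weights, tokens)
        # Independent sigmoid score per label (labels are non-exclusive)
        return torch.sigmoid(self.scorer(label_repr)).squeeze(-1)


if __name__ == "__main__":
    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = LabelAttentionClassifier(num_labels=4)
    batch = tokenizer(["an example document"], return_tensors="pt",
                      padding=True, truncation=True)
    probs = model(batch["input_ids"], batch["attention_mask"])
    print(probs.shape)  # torch.Size([1, 4]) -- one probability per label
```

The key point the sketch captures is that each label attends over the text tokens separately, so text features correlated with a given label are emphasized before that label's score is computed, rather than all labels sharing a single pooled sentence vector.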