
Research On Chinese Named Entity Recognition Based On Deep Learning

Posted on: 2020-05-26
Degree: Master
Type: Thesis
Country: China
Candidate: C Gong
Full Text: PDF
GTID: 2518306548994029
Subject: Management Science and Engineering
Abstract/Summary:
With the development of Internet and information technology, traditional search methods are gradually being replaced by search engines backed by knowledge graphs, which can provide users not only with more accurate results but also with associated knowledge. As a foundational step in building a knowledge graph, named entity recognition has long been a subject of close attention. Compared with research on named entities in English, entity extraction technology for Chinese lags behind: most methods adapt research done on English and add hand-crafted linguistic features to recognize and extract entities. This is because Chinese has certain particularities relative to English, including the diversity of named entity types, the absence of natural word separators, and the nesting that frequently occurs in Chinese names. These particularities leave current Chinese named entity recognition with substantial room for improvement.

This thesis takes deep-learning-based Chinese named entity recognition as its main research content. It surveys existing named entity recognition methods for Chinese and other languages, focuses on deep learning approaches, and summarizes the difficulties of current Chinese named entity recognition. The currently popular deep learning sequence labeling model, the neural network + CRF architecture, is introduced as the benchmark model and then improved so that it can be better applied to the Chinese named entity recognition task.

First, to weaken the model's dependence on hand-crafted features and improve its ability to capture features, this thesis adds a multi-head self-attention mechanism to the BiLSTM+CRF baseline model; by assigning weights, the mechanism highlights the role of keywords in extraction and prevents the loss of key information. To extract deeper features, a multi-layer neural network is used, and to mitigate the vanishing- and exploding-gradient problems, a densely connected network structure is applied to the Chinese named entity recognition model, which makes the model converge faster. The proposed model is evaluated on Chinese datasets and compared with existing models; experiments show that it is feasible and achieves more competitive results. A further comparison between the densely connected model and a directly stacked neural network shows that dense connections yield better results and better convergence.

Second, this thesis introduces another Chinese named entity recognition framework based on GRU, which uses BiGRU+CRF as the main part of the model. Compared with an LSTM network, a GRU network has a simpler structure and is easier to train. To improve extraction performance, a new word-vector representation, BERT, is adopted; as a dynamic (contextual) word vector, it provides richer semantics than current mainstream word vectors and enables the model to capture richer features of Chinese text. Experiments verify that BERT benefits the model to a certain degree: the proposed model achieves a higher F1 score and more competitive results than other models.

In conclusion, this thesis proposes two deep-learning-based Chinese named entity recognition frameworks that achieve high recognition performance on common datasets, with good portability and generalization ability, making them practical frameworks for Chinese named entity recognition.
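The keyword-weighting idea behind the multi-head self-attention mechanism described above can be sketched in plain NumPy. This is an illustrative stand-alone sketch only, not the thesis implementation: in the actual model this layer sits between the BiLSTM encoder and the CRF tagger, and the projection matrices here are hypothetical random initializations.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, num_heads, Wq, Wk, Wv, Wo):
    """Scaled dot-product self-attention over a sequence.

    X:  (T, d) sequence of T token representations of dimension d
        (e.g. BiLSTM hidden states in the thesis model).
    Wq, Wk, Wv, Wo: (d, d) projection matrices; d must divide by num_heads.
    Returns a (T, d) sequence where each position is a weighted mix of
    all positions, so informative tokens can be emphasized.
    """
    T, d = X.shape
    dh = d // num_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv          # linear projections, (T, d) each
    heads = []
    for h in range(num_heads):
        s = slice(h * dh, (h + 1) * dh)        # this head's feature slice
        scores = (Q[:, s] @ K[:, s].T) / np.sqrt(dh)   # (T, T) similarities
        weights = softmax(scores, axis=-1)             # each row sums to 1
        heads.append(weights @ V[:, s])                # weighted value mix
    return np.concatenate(heads, axis=-1) @ Wo         # merge heads, (T, d)
```

Because each output position is a convex combination of all value vectors, large attention weights let the model keep key tokens influential across the whole sequence, which is the "highlight keywords, avoid losing key information" effect the thesis attributes to this layer.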
Keywords/Search Tags: Chinese Named Entity Recognition, Multi-head Self-attention Mechanism, Dense Connection, BiLSTM, GRU, BERT