
Named Entity Recognition Based On LSTM With Hierarchical Residual Connection

Posted on: 2022-08-06  Degree: Master  Type: Thesis
Country: China  Candidate: Y Li  Full Text: PDF
GTID: 2518306575965519  Subject: Computer Science and Technology
Abstract/Summary:
Named entity recognition (NER) is a fundamental task in information extraction and plays a crucial role in natural language processing applications. The task is to identify entities with specific meanings in sentences, and it consists of two elements: determining the entity type and identifying the boundaries of each entity. Character-level deep learning models are currently widely used for NER, with Long Short-Term Memory (LSTM) networks typically serving as the feature-extraction module because of their excellent ability to capture contextual information. However, for identifying entity boundaries, existing LSTMs suffer from an insufficient ability to represent short-term information features. To address this problem, this thesis proposes an LSTM NER model based on hierarchical residual connections, together with an extension of that model that incorporates lexical information. The main research contents are as follows.

1. To address the insufficient extraction of short-term information by LSTM, this thesis proposes an LSTM model based on hierarchical residual connections. Firstly, the LSTM network is deepened by stacking residual blocks, enhancing its nonlinear fitting of short-term information features. Secondly, the global information of the input sequence is encoded and the activation function is dynamically adjusted to obtain more accurate activation values, further strengthening the network's fitting ability. Finally, an attention mechanism dynamically adjusts the number of residual layers applied to each input, improving the fitting ability of the model.

2. To further improve the network's ability to capture local information, this thesis proposes an LSTM NER model based on hierarchical residual connections and lexical information. The model adds a local convolutional attention module to the hierarchical residual LSTM and introduces a separate module for extracting lexical information features. Firstly, the character sequences are processed by a convolutional network with kernels of different sizes to obtain lexical features under different window widths; subsequently, the semantic features within each window are captured with a self-attention mechanism, compensating for the convolutional network's inability to mine semantic information.

Experiments on three publicly available datasets show that both proposed models outperform most of the improved algorithms, achieving F1 scores of 0.7001 and 0.6970, respectively, on the Weibo corpus.
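The two modules described in the abstract can be sketched roughly as follows. This is a minimal NumPy illustration under stated assumptions, not the thesis's implementation: the global-information-conditioned activation adjustment and all training details are omitted, a sliding-window average stands in for the convolution, and every name, dimension, and initialization here is an assumption for illustration only.

```python
import numpy as np

def lstm_layer(x, Wx, Wh, b):
    """Run one LSTM layer over a sequence x of shape (T, d) -> (T, d)."""
    T, d = x.shape
    h, c, outs = np.zeros(d), np.zeros(d), []
    for t in range(T):
        z = x[t] @ Wx + h @ Wh + b                 # all four gates at once, (4d,)
        i, f, g, o = np.split(z, 4)
        sig = lambda a: 1.0 / (1.0 + np.exp(-a))
        c = sig(f) * c + sig(i) * np.tanh(g)       # cell state update
        h = sig(o) * np.tanh(c)                    # hidden state
        outs.append(h)
    return np.stack(outs)

class HierResLSTM:
    """Sketch: stacked LSTM residual blocks + attention over layer depths."""
    def __init__(self, d, n_layers, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(d)
        self.params = [(rng.uniform(-s, s, (d, 4 * d)),
                        rng.uniform(-s, s, (d, 4 * d)),
                        np.zeros(4 * d)) for _ in range(n_layers)]
        self.attn = rng.uniform(-s, s, d)          # scores each layer's summary

    def forward(self, x):
        h, layer_outs = x, []
        for Wx, Wh, b in self.params:
            h = lstm_layer(h, Wx, Wh, b) + h       # residual connection per block
            layer_outs.append(h)
        # attention over depths: mimics dynamically weighting residual layers
        scores = np.array([out.mean(0) @ self.attn for out in layer_outs])
        w = np.exp(scores - scores.max()); w /= w.sum()
        return sum(wi * out for wi, out in zip(w, layer_outs))   # (T, d)

def conv_window(x, width):
    """Average characters in each sliding window (stand-in for a conv kernel)."""
    return np.stack([x[max(0, t - width + 1):t + 1].mean(0)
                     for t in range(x.shape[0])])

def self_attention(x):
    """Plain dot-product self-attention over the window features."""
    scores = x @ x.T / np.sqrt(x.shape[1])
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ x

T, d = 5, 8                                        # toy sequence: 5 chars, dim 8
x = np.random.default_rng(1).normal(size=(T, d))
y = HierResLSTM(d, n_layers=3).forward(x)          # contextual features, (5, 8)
lex = sum(self_attention(conv_window(x, k)) for k in (2, 3, 5)) / 3
feats = np.concatenate([y, lex], axis=1)           # fused representation, (5, 16)
print(feats.shape)
```

A real system would feed `feats` into a tag classifier (e.g. a CRF layer, which the abstract does not specify) and learn all weights by backpropagation rather than fixing them at random.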
Keywords/Search Tags: named entity recognition (NER), long short-term memory (LSTM), attention, lexical information features