
Research and Application of a Military Relation Extraction Method Based on a Pre-training Model

Posted on: 2023-02-11 | Degree: Master | Type: Thesis
Country: China | Candidate: Y N Jiang | Full Text: PDF
GTID: 2556306839994829 | Subject: Computer Science and Technology
Abstract/Summary:
With the advent of the information age, it has become easier for people to obtain knowledge, and the amount of military data is growing explosively. To relieve the pressure on intelligence analysts who face an ever-increasing volume of data, more and more natural language processing technologies are being applied in the military field to extract valuable information from military text. Among these technologies, relation extraction is a fundamental one and plays an important role in information acquisition. This paper applies relation extraction methods based on pre-training models to military text and studies their practical application.

Because no military text relation extraction dataset is available, this paper first constructs one, which includes negative samples and twelve relation types: colleague, native place, subordinate (person), position, use, location (location), location (organization), cooperation, subordinate (organization), manufacturing, equipment, and outfit. Model experiments and analysis are then carried out on this dataset. Taking the BiLSTM-Attention model as the baseline, this paper constructs the BiGRU-Attention, BERT_EM-BiGRU, and BERT_EM-BiGRU-Attention models, in which entity markers are used to integrate the positional information of entities into the input and output of the BERT model. Among them, the BERT_EM-BiGRU-Attention model performs best, indicating that the attention mechanism helps the model focus on important information; its F1 value reaches 90.72%, about 7% higher than that of the baseline model.

However, the original BERT model masks individual characters and does not take the relationship between words into account. In addition, local information in the input sentence also plays an important role in relation extraction. How to better integrate semantic information into the model, and how to integrate and strengthen local information within the sentence, are therefore urgent problems to be solved. In this paper, BERT_wwm, BERT_wwm_ext, RoBERTa_wwm_ext, and ERNIE are each combined with BiGRU-Attention and compared, among which RoBERTa_wwm_ext performs best. Furthermore, by combining a bidirectional recurrent neural network with a multi-size convolutional neural network, this paper constructs the RoBERTa_wwm_ext_EM-BiGRU-CNN model, which performs best on the dataset, reaching an F1 value of 91.96%.

Finally, this paper studies the application of military text relation extraction, using the proposed model to extract relations from text and visualize the results. The Neo4j database is then used to store the dynamic ontology graph, the PageRank algorithm is used to compress the dynamic ontology, and the DeepWalk algorithm is used to analyze event correlation in dynamic-ontology-based information analysis.
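To make the entity-marker architecture concrete, the following is a minimal sketch of a BERT_EM-BiGRU-Attention style relation classifier, not the thesis code: entity markers wrap the two entities in the input sentence, BERT encodes the marked sequence, a BiGRU re-reads the token representations, and an additive attention layer pools them for classification. It assumes PyTorch and the HuggingFace transformers library; the marker-token format, the bert-base-chinese checkpoint, the hidden size, and the thirteen output classes (twelve relations plus a negative class) are assumptions rather than details taken from the thesis.

# Minimal sketch of an entity-marker BERT + BiGRU + attention relation classifier.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

MARKERS = ["[E1]", "[/E1]", "[E2]", "[/E2]"]           # assumed entity-marker tokens

class BertEMBiGRUAttention(nn.Module):
    def __init__(self, bert_name="bert-base-chinese", num_relations=13, hidden=256):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        self.gru = nn.GRU(self.bert.config.hidden_size, hidden,
                          batch_first=True, bidirectional=True)
        self.att = nn.Linear(2 * hidden, 1)             # additive attention scorer
        self.cls = nn.Linear(2 * hidden, num_relations)

    def forward(self, input_ids, attention_mask):
        h = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        h, _ = self.gru(h)                              # (batch, seq, 2*hidden)
        scores = self.att(h).squeeze(-1)                # one score per token
        scores = scores.masked_fill(attention_mask == 0, -1e9)
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)
        pooled = (weights * h).sum(dim=1)               # attention-weighted sentence vector
        return self.cls(pooled)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
tokenizer.add_special_tokens({"additional_special_tokens": MARKERS})
model = BertEMBiGRUAttention()
model.bert.resize_token_embeddings(len(tokenizer))      # make room for the marker tokens

# Toy marked sentence (not from the thesis dataset), just to show the input format.
sentence = "[E1] 张三 [/E1] 在 [E2] 某研究所 [/E2] 担任工程师。"
enc = tokenizer(sentence, return_tensors="pt")
logits = model(enc["input_ids"], enc["attention_mask"])
print(logits.shape)                                     # (1, num_relations)

The RoBERTa_wwm_ext_EM-BiGRU-CNN variant described above would extend this skeleton by feeding the encoder output through convolutions with several kernel sizes in parallel alongside the BiGRU, then concatenating the pooled features before classification.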
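The analysis step at the end of the abstract can likewise be sketched: rank the nodes of the extracted relation graph with PageRank and keep only the highest-scoring ones as a compressed ontology view, then embed nodes with a DeepWalk-style procedure (truncated random walks fed to skip-gram) to measure entity/event correlation. The graph content, the cut-off of three nodes, and all hyper-parameters are illustrative assumptions, and the Neo4j storage layer used in the thesis is omitted here for brevity.

# Minimal sketch: PageRank compression and DeepWalk-style node embeddings.
import random
import networkx as nx
from gensim.models import Word2Vec

# Toy relation graph standing in for the extracted dynamic ontology.
G = nx.Graph()
G.add_edges_from([("PersonA", "UnitX"), ("UnitX", "EquipY"),
                  ("PersonB", "UnitX"), ("EquipY", "FactoryZ")])

# 1) PageRank-based compression: keep only the most central nodes.
pr = nx.pagerank(G)
top_nodes = sorted(pr, key=pr.get, reverse=True)[:3]    # assumed cut-off
compressed = G.subgraph(top_nodes)
print(list(compressed.nodes()))

# 2) DeepWalk-style embeddings: truncated random walks + skip-gram.
def random_walks(graph, num_walks=10, walk_len=8):
    walks = []
    for _ in range(num_walks):
        for node in graph.nodes():
            walk = [node]
            for _ in range(walk_len - 1):
                nbrs = list(graph.neighbors(walk[-1]))
                if not nbrs:
                    break
                walk.append(random.choice(nbrs))
            walks.append(walk)
    return walks

emb = Word2Vec(sentences=random_walks(G), vector_size=32, window=3,
               min_count=0, sg=1, epochs=20)
print(emb.wv.most_similar("UnitX", topn=2))              # nodes most correlated with UnitX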
Keywords/Search Tags: Military, Intelligence analysis, Relation extraction, BERT, BiGRU, Convolutional neural network