
Research On The Key Issues Of Event Coreference Resolution

Posted on: 2021-03-09
Degree: Master
Type: Thesis
Country: China
Candidate: R Y Wu
Full Text: PDF
GTID: 2428330605474863
Subject: Computer technology

Abstract/Summary:
Event coreference resolution plays an important role in Natural Language Processing (NLP). Given an unstructured text, it aims to find the descriptions of real-world events and to determine which descriptions refer to the same event. It benefits downstream NLP applications such as information extraction, reading comprehension, and event temporal relation identification.

In recent years, deep learning has become very popular and has achieved breakthroughs in various NLP tasks. Compared with traditional machine learning approaches, deep neural networks can use dense vectors to represent semantic similarity. In this paper, we first propose a neural event coreference resolution model, and then improve it by exploiting the structural characteristics and intrinsic properties of events. Our contributions include:

(1) We propose an end-to-end event coreference resolution model that addresses the error-propagation problem of pipeline models. At the same time, unlike joint learning models, it does not require many manually constructed features. Using only word embeddings and part-of-speech embeddings, combined with a long short-term memory (LSTM) network and an attention mechanism, our model performs the task effectively and achieves satisfactory performance on the KBP corpus.

(2) Compared with entity coreference resolution, event coreference resolution is more challenging. Event mentions make up only a small portion of the entire text, and many words are irrelevant to them; yet events are closely related to the topic of the text. We therefore propose a document embedding approach and incorporate the document representation into our neural model. The approach uses a hierarchical attention mechanism to build a representation of the entire text content, and uses a sequence tagging method to distinguish different content, which allows us to extract event mentions more accurately. Experimental results show the effectiveness of our document embedding approach.

(3) Among event coreference chains, singletons form the majority, which greatly increases the difficulty of event coreference resolution. To remove these singletons, we improve our neural model by enhancing the representation of events. In particular, we incorporate role information and coarse-grained event category information, and employ a biaffine attention mechanism to compute the coreference confidence of event mention pairs. Experimental results show the effectiveness of the enhanced event representation.
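As a rough illustration of the input side of contribution (1), the sketch below builds token representations by concatenating word and part-of-speech embeddings, the only features the model requires before the LSTM and attention layers. The vocabulary, tags, and dimensions here are toy placeholders, not values from the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy lookup tables; a real model would learn these embeddings during training.
WORD_DIM, POS_DIM = 8, 4
word_emb = {w: rng.standard_normal(WORD_DIM) for w in ["the", "bomb", "exploded"]}
pos_emb = {p: rng.standard_normal(POS_DIM) for p in ["DT", "NN", "VBD"]}

def token_features(tokens, pos_tags):
    """Each token is represented as [word embedding ; POS embedding]."""
    return np.stack([
        np.concatenate([word_emb[w], pos_emb[p]])
        for w, p in zip(tokens, pos_tags)
    ])

feats = token_features(["the", "bomb", "exploded"], ["DT", "NN", "VBD"])
print(feats.shape)  # (3, 12): three tokens, WORD_DIM + POS_DIM features each
```

In the full model, this feature matrix would be fed through a bidirectional LSTM and an attention layer to produce contextual event mention representations.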
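The hierarchical attention idea in contribution (2) can be sketched as follows: word-level attention pools each sentence into a sentence vector, and sentence-level attention pools those into a document vector. This is a minimal sketch with randomly initialized, hypothetical query vectors; the thesis learns these parameters and additionally uses sequence tagging, which is omitted here:

```python
import numpy as np

def softmax(x):
    z = np.exp(x - x.max())  # subtract max for numerical stability
    return z / z.sum()

def attend(vectors, query):
    """Attention pooling: weight each row of `vectors` by its affinity to `query`."""
    weights = softmax(vectors @ query)   # (n,)
    return weights @ vectors             # (d,)

def document_embedding(sentences, word_query, sent_query):
    """Hierarchical attention: words -> sentence vectors -> document vector."""
    sent_vecs = np.stack([attend(s, word_query) for s in sentences])
    return attend(sent_vecs, sent_query)

rng = np.random.default_rng(1)
d = 6
sentences = [rng.standard_normal((n, d)) for n in (5, 3, 7)]  # 3 toy sentences
doc = document_embedding(sentences, rng.standard_normal(d), rng.standard_normal(d))
print(doc.shape)  # (6,): one dense vector summarizing the whole document
```

The resulting document vector is what gets concatenated into the neural model so that mention detection can condition on the topic of the text.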
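Contribution (3) scores event mention pairs with a biaffine attention mechanism. A minimal sketch of such a pair scorer is below; the weights here are random placeholders, whereas the thesis learns them jointly with the enhanced event representations:

```python
import numpy as np

def biaffine_confidence(e1, e2, U, W, b):
    """Coreference confidence of an event mention pair:
    sigmoid( e1^T U e2 + W [e1; e2] + b )."""
    bilinear = e1 @ U @ e2                    # pairwise interaction term
    linear = W @ np.concatenate([e1, e2])     # per-mention linear term
    return 1.0 / (1.0 + np.exp(-(bilinear + linear + b)))

rng = np.random.default_rng(2)
d = 4
e1, e2 = rng.standard_normal(d), rng.standard_normal(d)
U = rng.standard_normal((d, d))
W = rng.standard_normal(2 * d)
score = biaffine_confidence(e1, e2, U, W, b=0.0)
print(0.0 < score < 1.0)  # True: a probability-like confidence in (0, 1)
```

A pair is then treated as coreferent when its confidence clears a decision threshold, and low-scoring mentions remain singletons.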
Keywords/Search Tags: Natural Language Processing, Event Coreference Resolution, Deep Learning