
Exploiting Dependency Parsing As An Auxiliary Task To Enhance AMR Parsing

Posted on: 2020-02-23  Degree: Master  Type: Thesis
Country: China  Candidate: T Z Wu  Full Text: PDF
GTID: 2428330578974167  Subject: Computer software and theory
Abstract/Summary:
Abstract Meaning Representation (AMR) is a domain-independent semantic representation of sentences, which uses a rooted, directed acyclic graph to represent sentence semantics. The goal of the AMR parsing task is to convert sentences into their corresponding AMRs. At present, research on AMR parsing is still in its infancy. By analyzing AMR examples, we find that AMR graphs are closely related to their corresponding dependency trees. However, most existing deep-learning-based AMR parsing models fail to make effective use of dependency information, which could improve AMR parsing performance, even though existing dependency parsers are more accurate than AMR parsers. Therefore, we conduct an experimental study on how to integrate and utilize dependency information in AMR parsing. Specifically, our work includes the following three aspects:

(1) Existing AMR parsing techniques are surveyed. Different AMR parsing models are analyzed and compared with respect to how they use dependency-tree information, and the shortcomings of current research are pointed out. The mainstream dependency parsing models and multi-task learning techniques are surveyed as well.

(2) Baseline models for dependency parsing and AMR parsing are designed and implemented. The dependency parsing baseline uses a self-attention-based encoder and predicts dependency relations with a deep biaffine classifier. In the AMR parsing baseline, concept identification is treated as a sequence labeling task solved by an LSTM-CRF model, and relation identification is solved similarly to the dependency parsing model. Finally, the AMR graph is generated by a greedy decoding algorithm. Experiments show that the baseline models perform well while keeping the two models structurally similar, which lays the foundation for the joint model.

(3) A new method for using dependency information in AMR parsing is proposed. A joint model for AMR parsing that integrates dependency parsing is established. The joint model uses parameter sharing to obtain general feature representations, and an attention mechanism is introduced so that AMR parsing can make full use of dependency knowledge. The loss function is improved to alleviate gradient imbalance, and a pre-trained language model is used to further improve performance. Experiments show that the joint model is effective in improving AMR parsing performance; compared with traditional approaches, it uses dependency-tree information more effectively.
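The deep biaffine relation scorer mentioned in (2) can be sketched as follows. This is a minimal NumPy illustration of biaffine arc scoring in general, not the thesis's actual model: the hidden size, the MLP projections, and the random stand-in encoder features are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 8  # hypothetical hidden size
n = 5  # tokens in the sentence

# Encoder outputs (random stand-ins for self-attention features)
H = rng.standard_normal((n, d))

# Separate projections specialize each token for "head" vs "dependent" roles
W_head = rng.standard_normal((d, d))
W_dep = rng.standard_normal((d, d))
Hh = np.tanh(H @ W_head)  # (n, d) head representations
Hd = np.tanh(H @ W_dep)   # (n, d) dependent representations

# Biaffine scoring: s[i, j] = Hd[i] @ U @ Hh[j] + Hd[i] . u + b,
# i.e. the score that token j is the head of dependent token i
U = rng.standard_normal((d, d))
u = rng.standard_normal(d)
b = 0.0
scores = Hd @ U @ Hh.T + (Hd @ u)[:, None] + b  # (n, n) arc scores

# Greedy head prediction: each dependent picks its highest-scoring head
heads = scores.argmax(axis=1)  # (n,) head index per token
```

In a trained parser the same bilinear form is applied once for arcs and once per relation label, and the (n, n) score matrix is fed to a softmax over candidate heads; here the greedy argmax only shows how predictions are read off the scores.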
Keywords/Search Tags: Abstract Meaning Representation, AMR Parsing, Dependency Parsing, Joint Model