Natural Language Understanding is one of the most fundamental and significant problems in Natural Language Processing (NLP). Conventionally, understanding natural language is a bottom-up process comprising word segmentation, part-of-speech tagging, named entity recognition, syntactic parsing, and finally semantic analysis. However, because Chinese is a morphologically poor language, its syntactic parsing performance has long lagged behind that of morphologically rich languages. Moreover, the conventional tree-structured annotation used in syntactic parsing has proved unable to capture the complicated semantic relations within a sentence. Graph-structured semantic annotation has therefore received growing interest in recent years. Semantic dependency graph parsing analyses the semantic relations between the words of a sentence and answers questions such as "who did what to whom, when, and where". It is based on dependency theory and has the advantage of being easy to understand and use. Compared with conventional syntactic parsing, semantic dependency graph parsing reveals the deep semantic meaning of a sentence and is therefore helpful for NLP tasks that require semantic information, such as machine translation, question answering, and information extraction. The introduction of graph structure improves the ability to capture complicated semantic relations, but it also makes parsing more challenging. As an emerging task, both the parsing approach and the application method of semantic dependency graphs are yet to be explored. This paper focuses on the parsing and application of Chinese semantic dependency graphs and conducts research in the following four directions:

1. Words with multiple heads are hard to handle in semantic dependency graph parsing. To solve this problem, we propose a transition-based approach with transition actions that assign multiple heads to a word, which can generate semantic dependency graphs
automatically. Besides, to obtain information about the transition states efficiently and accurately for predicting the next transition action, we propose a Bi-LSTM Subtraction module and an Incremental Tree-LSTM module to model the lists and the partially built subgraphs, respectively. Experimental results show that our method clearly outperforms previous work.

2. Existing semantic dependency graph parsers are vulnerable: their predictions can easily be misled by small perturbations of the input. To solve this problem, we first propose an adversarial attack algorithm against existing dependency parsers that generates high-quality adversarial examples. We then use the algorithm to analyse the robustness of existing dependency parsers and, based on the findings, improve their robustness with adversarial training and model ensembling.

3. Manually annotated semantic dependency graphs currently exist for only a few languages, and such annotation is hard to obtain for most languages of the world. To solve this problem, we first modify and simplify the Chinese semantic dependency graph annotation guidelines to make them applicable to other languages. We then propose methods based on label switching and graph neural networks that automatically convert existing multilingual universal syntactic dependency tree annotation into semantic dependency graph annotation. Moreover, combining the proposed methods with cross-lingual pre-trained models enables parsing of languages that have no manually annotated semantic dependency graphs.

4. Existing research on semantic dependency graph parsing has mostly focused on dataset construction and parsing models. However, the ultimate goal of semantic dependency graph parsing is to provide semantic information that benefits other NLP tasks. To achieve this goal, we propose a character-level dependency-enhanced network that tightly combines character-level semantic dependency graphs with pre-trained models and significantly improves
the results on Chinese semantic role labeling and relation extraction.

Overall, this paper starts by building a Chinese semantic dependency graph parser, then explores methods for improving the robustness of neural parsers and techniques for cross-lingual semantic dependency graph parsing. Finally, the semantic information in semantic dependency graphs is incorporated into pre-trained models, thereby improving the performance of other NLP tasks. Our research verifies that Chinese semantic dependency graphs can effectively help other NLP tasks and proposes a unified framework for this purpose. In the future, we expect to apply our research to more NLP tasks, thus promoting the development of the NLP field.
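To illustrate the core idea of the first contribution, the following is a minimal sketch of a transition system in which arc actions do not remove the dependent, so a word can receive multiple heads. All class and action names here are illustrative assumptions for exposition, not the implementation described in the thesis.

```python
# Minimal sketch of a transition system for semantic dependency GRAPH
# parsing. Unlike the arc actions of tree parsing, LEFT-ARC here does
# not pop the dependent, so a word may later receive further heads.

class GraphTransitionParser:
    def __init__(self, n_words):
        self.stack = []                     # indices of partially processed words
        self.buffer = list(range(n_words))  # indices of unprocessed words
        self.arcs = set()                   # (head, dependent, label) triples

    def shift(self):
        # move the front of the buffer onto the stack
        self.stack.append(self.buffer.pop(0))

    def left_arc(self, label):
        # arc from the buffer front to the stack top; the dependent
        # stays on the stack and may receive additional heads
        dep, head = self.stack[-1], self.buffer[0]
        self.arcs.add((head, dep, label))

    def right_arc(self, label):
        # arc from the stack top to the buffer front; the dependent
        # stays in the buffer and may receive additional heads
        head, dep = self.stack[-1], self.buffer[0]
        self.arcs.add((head, dep, label))

    def reduce(self):
        # pop a word that needs no further heads or dependents
        self.stack.pop()


# "He came and left": "He" is the agent of both "came" and "left",
# i.e. a word with two heads, which a single tree cannot represent.
words = ["He", "came", "and", "left"]
p = GraphTransitionParser(len(words))
p.shift()            # stack: [He]             buffer: [came, and, left]
p.left_arc("Agt")    # came -> He
p.shift()            # stack: [He, came]       buffer: [and, left]
p.shift()            # stack: [He, came, and]  buffer: [left]
p.reduce()           # pop "and"
p.reduce()           # pop "came"
p.left_arc("Agt")    # left -> He: "He" now has a second head

heads_of_he = {h for (h, d, _) in p.arcs if d == 0}
print(sorted(words[h] for h in heads_of_he))  # ['came', 'left']
```

In a real parser the action sequence would be predicted by a neural model from representations of the transition states (in the thesis, via the Bi-LSTM Subtraction and Incremental Tree-LSTM modules); here it is scripted by hand purely to show the multi-head mechanism.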