
The Research on Key Problems in Deep Dependency Parsing

Posted on: 2020-03-11    Degree: Master    Type: Thesis
Country: China    Candidate: J X Cai    Full Text: PDF
GTID: 2518306185999909    Subject: Computer Science and Technology
Abstract/Summary:
Dependency parsing is one of the core tasks in natural language processing. It determines sentence structure by analyzing the dependency relationships between words, which can in turn benefit many downstream tasks. Transition-based and graph-based parsers are the two typical models used in dependency parsing. The former is based on an abstract state machine and constructs a dependency tree through a series of transition actions; the latter is based on graph theory and finds the optimal dependency tree in the full hypothesis space using a maximum spanning tree algorithm. This thesis explores variants of the two methods, namely the easy-first model and the head selection model, and proposes effective neural network models to improve them.

Easy-first parsing incrementally constructs the complete dependency tree by re-ranking subtrees, so the intermediate state of parsing is represented by a collection of partial subtrees. Since the internal structural information of these subtrees is a key cue for later parsing decisions, we explore an effective encoding method based on deep learning. Specifically, Chapter 3 proposes an easy-first parsing system that encodes the partial dependency trees effectively. Evaluation on benchmark treebanks shows that a proper subtree encoder does improve the parsing process and enables the easy-first parser to achieve promising results compared with previous work.

The head selection model picks a dependency head for each word in the input sentence and then runs a maximum spanning tree algorithm over the results to enforce a well-formed dependency tree. Inspired by this idea, Chapter 4 employs a sequence-to-sequence model that generates a head position sequence for each input sentence, modeling parsing as predicting the relative position of each word's dependency head. To address long-distance dependencies and guarantee a valid tree structure, we further introduce a sub-root decomposition of the input sequence and a beam search decoder with tree constraints. The proposed model offers a novel way to perform dependency parsing without a transition system or graph-based algorithm, and it achieves results comparable to traditional models on benchmark treebanks.
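To make the Chapter 4 formulation concrete, the sketch below shows how a dependency tree can be written as a sequence of relative head positions and decoded back, with a check that the prediction forms a well-formed tree. This is a minimal illustration, not the thesis's code: it assumes 1-based word indices with 0 for ROOT and the convention that the root word receives offset 0, and it omits the sequence-to-sequence model, the sub-root decomposition, and the constrained beam search.

```python
# Minimal sketch of the "relative head position" target representation
# (assumed conventions: 1-based word indices, head 0 = ROOT, offset 0 = root word).

from typing import List, Optional


def heads_to_relative(heads: List[int]) -> List[int]:
    """Encode absolute head indices as relative offsets (head - word position)."""
    return [0 if h == 0 else h - (i + 1) for i, h in enumerate(heads)]


def relative_to_heads(offsets: List[int]) -> Optional[List[int]]:
    """Decode predicted offsets back to head indices.

    Returns None if the result is not a well-formed dependency tree
    (out-of-range head, self-loop, multiple roots, or a cycle).
    """
    n = len(offsets)
    heads = []
    for i, off in enumerate(offsets):
        h = 0 if off == 0 else (i + 1) + off   # offset 0 marks the root word
        if not (0 <= h <= n) or h == i + 1:
            return None                         # head out of range or self-loop
        heads.append(h)
    if heads.count(0) != 1:
        return None                             # exactly one word attaches to ROOT
    for i in range(1, n + 1):                   # every word must reach ROOT, no cycles
        seen, node = set(), i
        while node != 0:
            if node in seen:
                return None
            seen.add(node)
            node = heads[node - 1]
    return heads


if __name__ == "__main__":
    # "She reads books": word 2 ("reads") is the root, words 1 and 3 attach to it.
    gold = [2, 0, 2]
    offsets = heads_to_relative(gold)           # -> [1, 0, -1]
    assert relative_to_heads(offsets) == gold
```

A decoder with tree constraints, as described in the abstract, would use a check like `relative_to_heads` during beam search to prune hypotheses that cannot be completed into a valid tree, rather than applying it only after decoding.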
Keywords/Search Tags: Dependency Parsing, Neural Network, Tree Structure Representation, Sequence-to-Sequence Model