Abstract Meaning Representation Parsing with Rich Linguistic Feature

Posted on: 2018-01-30    Degree: Ph.D    Type: Dissertation
University: University of Colorado at Boulder    Candidate: Chen, Wei-Te    Full Text: PDF
GTID: 1448390005951541    Subject: Artificial Intelligence
Abstract/Summary:
Lexical and syntactic information has been shown to play an important role in semantic parsing. However, the relationship between semantic parsing and the types of linguistic knowledge that support it (e.g., lexical cues, dependency structures, and semantic roles) remains little studied. Dependency structures are also known to provide rich syntactic information for many NLP applications, yet few applications use them within an underlying neural network framework. This dissertation introduces a complete framework for parsing Abstract Meaning Representations (AMRs), a semantic representation that expresses the meaning of a sentence as a directed acyclic graph. To enhance our AMR parser, we first develop a light verb construction (LVC) detector using an SVM. We also link input dependency parses to AMR concepts with an EM-based approach that generates alignment pairs.

The main parser is split into three sub-components: a frame identifier, a concept identifier, and a transition action identifier. To support these components, we develop a Recursive Neural Network (RevNN) model that serves as the underlying framework for all three. RevNN operates over dependency structures combined with distinct linguistic features, generating a vector representation for each dependency node and passing these vectors to the three identifiers. By integrating all of the above components, we design a transition-based parser that generates AMR graphs from input dependency parses.

Results show that our LVC detector surpasses comparable systems by 3 to 4% in F1 score, and that this LVC detector supports the AMR parser. Our aligner improves F1 score by 2 to 5% when LVC information is included. Moreover, the resulting AMR parser achieves the best Smatch scores among transition-based AMR parsers. We also show that the RevNN framework helps integrate different linguistic features, improving the accuracy of the individual components.
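To illustrate the idea of producing one vector per dependency node, the following is a minimal sketch of bottom-up recursive composition over a dependency tree. It is not the dissertation's RevNN implementation: the toy tree, embedding dimension, weights, and the `encode` function are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # embedding / hidden size (assumed for illustration)

# Toy dependency tree for "dogs chase cats": head word -> list of dependents.
tree = {"chase": ["dogs", "cats"], "dogs": [], "cats": []}

# Random word embeddings standing in for real lexical features.
emb = {w: rng.standard_normal(DIM) for w in tree}

# Composition weights: map [node embedding; summed child vectors] -> node vector.
W = rng.standard_normal((DIM, 2 * DIM)) * 0.1
b = np.zeros(DIM)

def encode(word):
    """Compose a node's vector from its embedding and its dependents' vectors,
    recursing bottom-up through the dependency tree."""
    child_sum = sum((encode(c) for c in tree[word]), np.zeros(DIM))
    return np.tanh(W @ np.concatenate([emb[word], child_sum]) + b)

root_vec = encode("chase")  # every node gets a vector; the root's is shown here
print(root_vec.shape)       # (8,)
```

In a full parser, per-node vectors like these would be passed to downstream classifiers (here, the frame, concept, and transition action identifiers) as shared input features.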
Keywords/Search Tags: Linguistic, Parsing, AMR parser, Framework, Dependency structures, Meaning, Representation, Components