
Research On Dependency Tree Kernel-based Semantic Role Labeling

Posted on: 2013-01-04 | Degree: Master | Type: Thesis
Country: China | Candidate: B K Wang | Full Text: PDF
GTID: 2248330371993524 | Subject: Computer application technology

Abstract/Summary:
Automatic semantic parsing has long been one of the main goals of natural language understanding. Because deep semantic parsing is difficult, previous work mainly focuses on shallow semantic parsing, which only labels the predicate-related constituents of a sentence with semantic roles such as agent, patient, time, and place. As a particular case of shallow semantic parsing, semantic role labeling (SRL) has a clear task definition and is easy to evaluate, and it has attracted increasing attention in recent years. SRL has been applied in many deep natural language processing applications, such as machine translation, information extraction, and question answering.

Generally, there are two kinds of methods for SRL. One is feature-based methods, which map a predicate-argument structure to a flat feature vector. The other is tree kernel-based methods, which represent a predicate-argument structure as a parse tree and directly measure the similarity between two predicate-argument parse trees instead of between feature vector representations. Feature-based methods have consistently performed much better than kernel-based methods and represent the state of the art in SRL. However, as more and more features are added, interactions among features become increasingly serious, so performance gains gradually slow down and approach an upper bound. Tree kernel-based methods have greater potential than feature-based methods to capture the structured knowledge in the parse tree, which is critical for the success of SRL. In the literature, however, only a few studies (Moschitti and Bejan, 2004; Moschitti, 2004; Moschitti et al., 2006; Zhang et al., 2007) employ tree kernel-based methods for SRL, and most of them focus on the constituent parse tree (CPT) structure.

In this paper, we explore a tree kernel-based method for Chinese SRL based on the dependency parse tree (DPT), with the following focuses:

1. We explore a new syntactic parse tree structure, called the dependency relation-driven constituent parse tree (DR-CPT). It is obtained by transforming the DPT into a CPT-style structure in which dependency relation types replace the phrase labels of the traditional CPT structure (see the sketches after this abstract). In this way, our tree kernel-based method can benefit from the advantages of both the DPT and the CPT.

2. We apply tree kernel-based SRL with the DR-CPT structure to Chinese nominal predicates. Three schemes are designed to extract the necessary information from the DR-CPT structure, such as the shortest path between the nominal predicate and the argument candidate, the support verb of the nominal predicate, and the head argument modified by the argument candidate. Evaluation on Chinese NomBank shows that our tree kernel-based method on the novel DR-CPT achieves performance comparable to state-of-the-art feature-based SRL systems.

3. To further demonstrate the effectiveness of the novel DR-CPT structure in representing dependency relations for tree kernel-based methods, we also conduct experiments on the CoNLL-2009 Chinese corpus for verbal predicates. We complete the CoNLL-2009 SRL-only shared task, which includes predicate disambiguation and SRL, and again propose three information extraction schemes. Evaluation on the CoNLL-2009 corpus shows that our method falls below the best CoNLL-2009 results by only 0.13.
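
To make the DR-CPT idea in item 1 concrete, the Python sketch below converts a toy dependency parse into a constituent-style tree whose node labels are dependency relation types. The sentence, the relation labels, and helper names such as DepToken and to_dr_cpt are assumptions made for this example, not the thesis's actual construction.

```python
# Minimal sketch: turn a dependency parse into a CPT-style tree whose node
# labels are dependency relation types (the DR-CPT idea described above).
# Sentence, relation labels, and helper names are illustrative assumptions.

from dataclasses import dataclass, field
from typing import List


@dataclass
class DepToken:
    form: str        # surface word
    head: int        # index of the head token (-1 for the root)
    deprel: str      # dependency relation to the head


@dataclass
class Node:
    label: str
    children: List["Node"] = field(default_factory=list)

    def pretty(self, indent: int = 0) -> str:
        pad = "  " * indent
        return "\n".join([f"{pad}{self.label}"] +
                         [c.pretty(indent + 1) for c in self.children])


def to_dr_cpt(tokens: List[DepToken]) -> Node:
    """Each token becomes a node labelled with its dependency relation type,
    dominating a leaf for the word itself plus the subtrees of its dependents."""
    nodes = [Node(t.deprel, [Node(t.form)]) for t in tokens]
    root = None
    for i, t in enumerate(tokens):
        if t.head == -1:
            root = nodes[i]
        else:
            nodes[t.head].children.append(nodes[i])
    return root


# Toy dependency parse of "the company increased prices" (labels assumed).
sent = [
    DepToken("the", 1, "NMOD"),
    DepToken("company", 2, "SBJ"),
    DepToken("increased", -1, "ROOT"),
    DepToken("prices", 2, "OBJ"),
]
print(to_dr_cpt(sent).pretty())
```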
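
Similarly, the sketch below shows how a convolution tree kernel can score the similarity of two such relation-labelled fragments by counting shared subtree structure. The fragments, labels, decay factor, and the names TreeNode, delta, and tree_kernel are hypothetical; the formulation follows the general Collins-and-Duffy style rather than the thesis's exact kernel configuration.

```python
# Minimal sketch of a convolution tree kernel over CPT-style fragments whose
# node labels are dependency relation types (DR-CPT-style).  All values here
# are illustrative assumptions, not the thesis's actual setup.

from dataclasses import dataclass, field
from typing import List


@dataclass
class TreeNode:
    label: str
    children: List["TreeNode"] = field(default_factory=list)

    def production(self):
        # A node's "production": its label plus the ordered labels of its children.
        return (self.label, tuple(c.label for c in self.children))

    def nodes(self):
        yield self
        for c in self.children:
            yield from c.nodes()


def delta(n1: TreeNode, n2: TreeNode, lam: float) -> float:
    """Weighted count of matching tree fragments rooted at n1 and n2."""
    if n1.production() != n2.production():
        return 0.0
    if not n1.children:          # matching leaf
        return lam
    score = lam
    for c1, c2 in zip(n1.children, n2.children):
        score *= 1.0 + delta(c1, c2, lam)
    return score


def tree_kernel(t1: TreeNode, t2: TreeNode, lam: float = 0.4) -> float:
    """Convolution tree kernel: sum delta over all pairs of nodes."""
    return sum(delta(a, b, lam) for a in t1.nodes() for b in t2.nodes())


# Two toy DR-CPT fragments, e.g. predicate-argument paths with relation labels.
frag1 = TreeNode("ROOT", [TreeNode("SBJ", [TreeNode("NMOD")]), TreeNode("OBJ")])
frag2 = TreeNode("ROOT", [TreeNode("SBJ", [TreeNode("NMOD")]), TreeNode("ADV")])
print(tree_kernel(frag1, frag2))   # larger values = more shared substructure
```

In a kernel-based SRL system, such similarity scores would typically feed a kernel machine (for example an SVM) that classifies argument candidates into semantic roles.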
Keywords/Search Tags: Semantic Role Labeling, Dependency Parse Tree, Convolution Tree Kernel, Natural Language Processing