
Research On Recurrent Neural Network Based Dependency Parsing Model

Posted on: 2017-01-27
Degree: Master
Type: Thesis
Country: China
Candidate: J C Zhang
GTID: 2348330485481332
Subject: Systems analysis and integration

Abstract/Summary:
Dependency parsing is a fundamental task in Natural Language Processing (NLP). The output of a dependency parser directly affects the ability of an NLP system to analyze sentence meaning. Traditionally, statistical features in the training corpus are represented as strings, so the performance of a dependency parser relies mainly on hand-crafted feature templates, which suffer from data sparsity and consume substantial memory and computing resources. When a large number of features is applied, the model tends to overfit and parsing speed drops. To address these problems, we propose a novel dependency parsing model based on distributed representations of words and part-of-speech tags from deep learning. The main research contents are as follows.

In this thesis, we first introduce the key concepts and traditional methods of dependency parsing. We then analyze the principles of neural network language models and word embedding techniques. Finally, we describe a typical neural-network-based dependency parsing model, which achieves good efficiency and accuracy by using distributed representations, although its parsing accuracy can be improved further.

We found that the existing transition-based method in neural-network dependency parsing uses only the context information within a limited-length window, so useful long-distance information is neglected. We therefore propose and implement a recurrent neural network (RNN) based dependency parsing model that improves the distributed representations of words and part-of-speech tags by using a bidirectional long short-term memory (BiLSTM) network as the non-linear layer of the network architecture. Our experiments demonstrate that the proposed method reduces the complexity of system design and improves the performance of dependency parsing.

Furthermore, we propose and implement a layer-based encoder-decoder dependency parsing model that integrates the strengths of traditional transition-based and graph-based methods. In this model, the entire sentence is encoded into a low-dimensional vector; in the decoding phase, the current classification label is conditioned on context features and on the summary vector of the input sentence. Experiments on the public PTB and CTB data sets show that our model effectively improves both parsing speed and parsing accuracy.
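To make the BiLSTM idea above concrete, below is a minimal sketch in Python/PyTorch: a BiLSTM produces contextual word/POS representations for the whole sentence, and a small classifier scores transition actions from a few of those representations instead of from a fixed-window string-feature template. This is only an illustration under assumptions, not the thesis's actual implementation; all names (BiLSTMFeatureEncoder, TransitionClassifier), the three-action inventory, and every hyperparameter are hypothetical.

import torch
import torch.nn as nn

class BiLSTMFeatureEncoder(nn.Module):
    # Illustrative sketch: a BiLSTM over word + POS embeddings gives each
    # token a representation that carries long-distance sentence context,
    # replacing a fixed-length context window.
    def __init__(self, vocab_size, pos_size, emb_dim=50, hidden_dim=100):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        self.pos_emb = nn.Embedding(pos_size, emb_dim)
        self.bilstm = nn.LSTM(2 * emb_dim, hidden_dim,
                              batch_first=True, bidirectional=True)

    def forward(self, word_ids, pos_ids):
        # word_ids, pos_ids: (batch, seq_len) index tensors
        x = torch.cat([self.word_emb(word_ids), self.pos_emb(pos_ids)], dim=-1)
        ctx, _ = self.bilstm(x)   # (batch, seq_len, 2 * hidden_dim)
        return ctx

class TransitionClassifier(nn.Module):
    # Scores transition actions (here simplified to SHIFT / LEFT-ARC /
    # RIGHT-ARC) from the BiLSTM vectors of a few stack/buffer positions.
    def __init__(self, token_dim, n_positions=4, n_actions=3, hidden=200):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(n_positions * token_dim, hidden),
            nn.Tanh(),   # stand-in for the model's non-linear layer
            nn.Linear(hidden, n_actions),
        )

    def forward(self, position_feats):
        # position_feats: (batch, n_positions, token_dim)
        return self.mlp(position_feats.flatten(1))

# Hypothetical usage: in a real parser the queried positions would be
# gathered from the current stack/buffer configuration.
encoder = BiLSTMFeatureEncoder(vocab_size=10000, pos_size=50)
classifier = TransitionClassifier(token_dim=200)
words = torch.randint(0, 10000, (1, 8))
tags = torch.randint(0, 50, (1, 8))
scores = classifier(encoder(words, tags)[:, :4, :])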
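Likewise, a hedged sketch of the encoder-decoder idea: the sentence is encoded into a low-dimensional summary vector, and each token's label (e.g. a head/relation tag, viewing parsing as sequence labeling) is predicted conditioned on local features plus that global summary. Again, the class name, the label scheme, and all dimensions are assumptions for illustration, not the thesis model.

import torch
import torch.nn as nn

class EncoderDecoderLabeler(nn.Module):
    # Illustrative sketch: encode the whole sentence into a summary vector,
    # then label every token conditioned on both its local embedding and
    # that sentence-level summary, as the abstract describes.
    def __init__(self, vocab_size, emb_dim=50, hidden_dim=100, n_labels=64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim + hidden_dim, hidden_dim,
                               batch_first=True)
        self.out = nn.Linear(hidden_dim, n_labels)

    def forward(self, word_ids):
        x = self.emb(word_ids)                 # (batch, seq, emb_dim)
        enc_out, (h_n, c_n) = self.encoder(x)
        summary = h_n[-1]                      # (batch, hidden_dim) summary vector
        # Condition every decoding step on the global summary as well as
        # the local token embedding.
        summary_tiled = summary.unsqueeze(1).expand(-1, x.size(1), -1)
        dec_in = torch.cat([x, summary_tiled], dim=-1)
        dec_out, _ = self.decoder(dec_in, (h_n, c_n))
        return self.out(dec_out)               # per-token label scores

# Hypothetical usage: logits over n_labels for each of the 8 tokens.
model = EncoderDecoderLabeler(vocab_size=10000)
logits = model(torch.randint(0, 10000, (1, 8)))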
Keywords/Search Tags: Dependency Parsing, Distributed Representation, Recurrent Neural Network, Sequence Labeling, Encoder-Decoder