
Parse Tree Based Neural Networks For Semantic Relatedness

Posted on: 2020-03-28  Degree: Master  Type: Thesis
Country: China  Candidate: R Q Yang  Full Text: PDF
GTID: 2428330575457995  Subject: Computer technology
Abstract/Summary:
Deep neural networks are dominant in research on semantic relatedness. Among them, parse tree based neural networks model natural language according to its intrinsic parse tree structure. Vector representations learned by parse tree based neural networks are then fed to semantic relatedness evaluation networks to determine the semantic relatedness of two input sentences. Parse tree based neural networks are better at modeling long-distance dependencies and can leverage external parse tree annotations to improve their performance.

However, existing parse tree based neural networks have several limitations. For instance, recursive neural networks struggle to model dependencies across multiple levels as the tree grows deep; the overall sentence representations computed by recurrent recursive networks are relatively weak; and modeling parse tree structures with ordered neurons cannot directly leverage external tree structure annotations. These issues limit the application of parse tree based neural networks to the semantic relatedness task.

This thesis targets these issues in existing parse tree based neural networks, proposes corresponding improvements, and evaluates them through experiments on the semantic relatedness task.

To address recursive neural networks' difficulty in modeling dependencies across multiple levels, this thesis borrows the idea of gated units from recurrent neural networks, adds gated units adapted from gated recurrent units to recursive neural networks, and provides two variants: Summed Gated Recurrent Units for dependency parse trees and Densely-Connected Gated Recurrent Units for constituency parse trees. Experiments show that this method achieves good performance and high efficiency on the semantic relatedness task.

To address the weak overall representations computed by recurrent recursive networks, this thesis borrows the idea of recursive units from recursive neural networks, uses recursive units as the merge cells of recurrent recursive networks, and proposes an enhanced bidirectional version. Experiments show that this method performs outstandingly in modeling tree structures for semantic relatedness.

To address the inability of ordered neurons to leverage external tree structure annotations, this thesis extends gated recurrent units with ordered neurons and explores exploiting massive unlabeled data with this more efficient model through unsupervised language-model pre-training. Experiments show that on the semantic relatedness task this method can leverage external unlabeled data to improve model performance. Analysis of the experiments shows that, after unsupervised pre-training, ordered neurons based on unsynchronized updates learn and transfer implicit tree structures. This method has a wide range of potential applications.
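To make the first contribution concrete, the following is a minimal numpy sketch of what a "Summed Gated Recurrent Unit" node composition might look like: children's hidden states are summed (as in the Child-Sum Tree-LSTM) while the gates follow GRU conventions. All names, dimensions, and the exact gate equations here are illustrative assumptions, not the thesis's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4  # toy hidden/input dimension

# Parameters of a single gated recursive cell (randomly initialized for the sketch)
W_r, U_r = rng.standard_normal((D, D)), rng.standard_normal((D, D))
W_z, U_z = rng.standard_normal((D, D)), rng.standard_normal((D, D))
W_h, U_h = rng.standard_normal((D, D)), rng.standard_normal((D, D))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def summed_gru_node(x, child_states):
    """Compose a node representation from its word vector `x` and its
    children's hidden states, summing over children with GRU-style gates.
    (Hypothetical sketch of a child-sum tree GRU cell.)"""
    h_sum = np.sum(child_states, axis=0)          # summed child state
    # one reset gate per child, so each child can be filtered separately
    r = [sigmoid(W_r @ x + U_r @ h_k) for h_k in child_states]
    z = sigmoid(W_z @ x + U_z @ h_sum)            # update gate
    h_cand = np.tanh(W_h @ x + U_h @ sum(r_k * h_k
                                         for r_k, h_k in zip(r, child_states)))
    # interpolate between the summed child state and the candidate state
    return z * h_sum + (1.0 - z) * h_cand

# Toy usage: a dependency head with two modifier children
x = rng.standard_normal(D)
children = [rng.standard_normal(D), rng.standard_normal(D)]
h = summed_gru_node(x, children)
print(h.shape)  # (4,)
```

The per-child reset gates let the cell discard an unhelpful subtree without suppressing its siblings, which is one plausible way gating helps dependencies propagate across tree levels.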
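For the third contribution, the core of the ordered-neuron idea (from ON-LSTM) is the cumulative-softmax "master" gate, which forces low-ranked neurons to retain state longer than high-ranked ones so that neurons update at unsynchronized rates. Below is a hedged sketch of how such a gate might be grafted onto a GRU step; the specific combination of master gate and GRU update is an assumption for illustration, not the thesis's exact model.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 6  # toy hidden dimension

W_z, U_z = rng.standard_normal((D, D)), rng.standard_normal((D, D))
W_h, U_h = rng.standard_normal((D, D)), rng.standard_normal((D, D))
W_m, U_m = rng.standard_normal((D, D)), rng.standard_normal((D, D))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cumax(x):
    """Cumulative softmax: a monotonically increasing gate in [0, 1].
    It softly splits the neurons into a low 'keep' segment and a high
    'update' segment -- the ordered-neuron inductive bias."""
    e = np.exp(x - x.max())
    return np.cumsum(e / e.sum())

def on_gru_step(x, h_prev):
    # master forget gate: low-ranked neurons preserve long-range state
    f_master = cumax(W_m @ x + U_m @ h_prev)
    z = sigmoid(W_z @ x + U_z @ h_prev)           # ordinary GRU update gate
    h_cand = np.tanh(W_h @ x + U_h @ (z * h_prev))
    h_new = z * h_prev + (1.0 - z) * h_cand       # standard GRU update
    # neurons where f_master is near 1 copy the old state unchanged,
    # so different neurons effectively update at different timescales
    return f_master * h_prev + (1.0 - f_master) * h_new

# Toy usage: run the cell over a short random sequence
h = np.zeros(D)
for x_t in rng.standard_normal((5, D)):
    h = on_gru_step(x_t, h)
g = cumax(rng.standard_normal(D))
print(h.shape)  # (6,)
```

Because the update schedule is learned rather than read from an annotated tree, pre-training such a model on unlabeled text can, as the abstract reports, induce implicit tree structures without external parse annotations.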
Keywords/Search Tags:Semantic Relatedness, Parse Tree, Recursive Neural Networks