
Dependency Parsing Research Model Based On Deep Learning

Posted on: 2020-12-07    Degree: Master    Type: Thesis
Country: China    Candidate: S Y Liu    Full Text: PDF
GTID: 2428330572981087    Subject: Engineering
Abstract/Summary:
With the rapid development of Internet technology and the spread of globalization, a large volume of corpora is accumulating on the Internet. Extracting valuable information from these corpora and processing it accordingly has become an important research topic in computer science, and natural language processing technology has emerged to address it. Parsing is a key technology in natural language processing, and dependency parsing in particular has been widely adopted because of its concise formalism and its ability to express the relationships between words. With the breakthroughs that deep learning has achieved across many areas of computer science in recent years, dependency parsing models based on deep learning offer new solutions to the problems of syntactic analysis.

In view of current trends in dependency parsing, this thesis designs and implements a joint dependency parsing model based on deep learning. First, syntactic structures and sequence annotations are modeled in a unified way with a Bi-directional Long Short-Term Memory (BiLSTM) network. Second, part-of-speech tagging is combined with the deep model so that tagging and dependency parsing are handled jointly, which largely reduces the error propagation that part-of-speech tagging introduces during feature extraction and addresses the problem that multi-level features cannot otherwise be obtained. Finally, multiple multi-layer perceptrons predict dependency arcs and dependency labels respectively.

On top of the dependency parsing model, the parsing system designed in this thesis uses React, ES6 and SVG to implement the front-end user interface, including user login, text training, text prediction and dependency graph display, and realizes data interaction between server and client. The joint analysis model follows the four axioms of dependency grammar in theory. By partitioning the data sets and training the joint model, the system can predict dependency structures for raw text. Evaluated on the two commonly used criteria for dependency parsing performance, unlabeled dependency prediction and labeled dependency prediction, experiments confirm that the joint analysis method based on part-of-speech tagging adopted in this model clearly improves parsing accuracy relative to the compared models.
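To make the described architecture concrete, the following is a minimal, hypothetical sketch of a model of this kind: a shared BiLSTM encoder, a joint part-of-speech tagging head, and separate multi-layer perceptrons that score dependency arcs and dependency labels. It is not the thesis's actual implementation; the class name, layer sizes and vocabulary sizes are illustrative assumptions.

```python
# Hypothetical sketch of a joint BiLSTM dependency parser (not the thesis code).
import torch
import torch.nn as nn


class JointBiLSTMParser(nn.Module):
    def __init__(self, vocab_size, n_pos, n_labels,
                 emb_dim=100, hidden_dim=200, mlp_dim=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Shared Bi-directional LSTM encoder over the word sequence.
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, num_layers=2,
                              bidirectional=True, batch_first=True)
        enc_dim = 2 * hidden_dim
        # Joint POS-tagging head: predicting tags from the shared encoder
        # avoids feeding errors from an external tagger into the parser.
        self.pos_mlp = nn.Sequential(nn.Linear(enc_dim, mlp_dim), nn.ReLU(),
                                     nn.Linear(mlp_dim, n_pos))
        # One pair of MLPs produces head/dependent representations for arc scoring.
        self.arc_head = nn.Sequential(nn.Linear(enc_dim, mlp_dim), nn.ReLU())
        self.arc_dep = nn.Sequential(nn.Linear(enc_dim, mlp_dim), nn.ReLU())
        self.arc_bilinear = nn.Parameter(torch.randn(mlp_dim, mlp_dim) * 0.01)
        # A second pair of MLPs scores the dependency label for each arc.
        self.lab_head = nn.Sequential(nn.Linear(enc_dim, mlp_dim), nn.ReLU())
        self.lab_dep = nn.Sequential(nn.Linear(enc_dim, mlp_dim), nn.ReLU())
        self.lab_out = nn.Linear(2 * mlp_dim, n_labels)

    def forward(self, word_ids):
        # word_ids: (batch, seq_len) integer word indices
        h, _ = self.bilstm(self.embed(word_ids))            # (B, T, enc_dim)
        pos_logits = self.pos_mlp(h)                          # (B, T, n_pos)
        # Arc scores: arc_scores[b, i, j] = score that token j heads token i.
        dep = self.arc_dep(h)                                 # (B, T, mlp_dim)
        head = self.arc_head(h)                               # (B, T, mlp_dim)
        arc_scores = dep @ self.arc_bilinear @ head.transpose(1, 2)   # (B, T, T)
        # Label scores for every (dependent, head) pair.
        T = h.size(1)
        lab_d = self.lab_dep(h).unsqueeze(2).expand(-1, -1, T, -1)
        lab_h = self.lab_head(h).unsqueeze(1).expand(-1, T, -1, -1)
        label_scores = self.lab_out(torch.cat([lab_d, lab_h], dim=-1))  # (B, T, T, n_labels)
        return pos_logits, arc_scores, label_scores


if __name__ == "__main__":
    model = JointBiLSTMParser(vocab_size=5000, n_pos=40, n_labels=45)
    words = torch.randint(0, 5000, (2, 12))                  # toy batch
    pos_logits, arc_scores, label_scores = model(words)
    print(pos_logits.shape, arc_scores.shape, label_scores.shape)
```

In a sketch like this, the heads for each word would be read off as the argmax over the arc-score rows (subject to a tree constraint), and the label head is queried only for the chosen arcs; sharing one BiLSTM across the tagging and parsing heads is what allows multi-level features to be learned jointly rather than pipelined.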
Keywords/Search Tags:Natural language processing, Dependency parsing, Deep learning, Bi-directional long short-term memory network, Multi-layer perceptrons