
A Study On The Influence Of Prior Knowledge On NN Model In Several NLP Tasks

Posted on: 2018-02-19  Degree: Master  Type: Thesis
Country: China  Candidate: C Bei  Full Text: PDF
GTID: 2348330518482357  Subject: Software engineering
Abstract/Summary:
With the development of the Internet and advances in computer hardware, neural network models have progressed rapidly in recent years and are now widely and maturely applied in image recognition and speech recognition. In natural language processing (NLP), neural networks have likewise become a research hotspot.

Previously, most NLP tasks required substantial linguistic knowledge, especially when using rule-based methods. Statistical NLP methods place lower linguistic demands on researchers than rule-based approaches do, but they still require knowledge of natural language processing. Neural network models, by contrast, need little NLP-specific knowledge: the input text is first represented by a word-vector (embedding) layer, and the network then extracts features from the text to complete the task. Researchers only need to understand neural networks to handle most NLP tasks.

However, although neural network models adapt to a wide variety of tasks, their results rarely exceed those of traditional methods; they are favored mainly for their ease of use, with performance only approaching traditional baselines. Part of the reason is that prior knowledge is rarely incorporated into neural network models: existing studies usually extract features directly from the text, and those features do not achieve the desired results.

This study therefore explores the influence of linguistic prior knowledge on NLP tasks solved with neural network models. For different tasks, we compare different kinds of prior knowledge and different positions at which the prior knowledge is injected into the model (varying the injection position mainly for machine translation tasks). We experiment on basic NLP tasks, text classification, and machine translation.
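The pipeline the abstract describes (a word-vector layer that represents the input text, followed by a model that extracts features and completes the task) can be sketched minimally as follows. This is an illustrative sketch only: the vocabulary size, dimensions, random initialization, and the mean-pooling linear classifier are all assumptions standing in for the thesis's actual networks.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB_SIZE, EMBED_DIM, NUM_CLASSES = 100, 8, 2  # illustrative sizes

# Word-vector (embedding) layer: one row per vocabulary id.
embeddings = rng.normal(size=(VOCAB_SIZE, EMBED_DIM))
# Classifier weights applied to the pooled text representation.
W = rng.normal(size=(EMBED_DIM, NUM_CLASSES))
b = np.zeros(NUM_CLASSES)

def classify(token_ids):
    """Embed the tokens, mean-pool into a fixed-size feature, score each class."""
    vectors = embeddings[token_ids]        # (seq_len, EMBED_DIM)
    pooled = vectors.mean(axis=0)          # fixed-size text feature
    logits = pooled @ W + b
    exp = np.exp(logits - logits.max())    # numerically stable softmax
    return exp / exp.sum()

probs = classify([3, 17, 42])
print(probs)
```

In a real system the mean-pooling step would be replaced by a learned feature extractor (e.g. a convolutional or recurrent network), and the weights would be trained rather than randomly initialized; the point here is only the structure: text ids → embedding lookup → feature extraction → task output.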
We find that, on some tasks, the same prior knowledge influences different neural network models differently. At the same time, on a suitable neural network model for a given task, adding appropriate prior knowledge can speed up convergence and improve the model's results.
Keywords/Search Tags:neural network, prior knowledge, natural language processing, basic task, text classification, machine translation