
Research On Key Technologies For Natural Language Understanding

Posted on: 2020-04-29
Degree: Master
Type: Thesis
Country: China
Candidate: C R Wang
Full Text: PDF
GTID: 2518306548995979
Subject: Computer Science and Technology
Abstract/Summary:
Natural language understanding is a fundamental task in task-oriented human-computer dialogue systems. For a given user input, it aims to transform natural language into a structured semantic representation that a computer can process. Natural language understanding comprises two subtasks, intention recognition and slot filling: intention recognition judges the user's intention, while slot filling extracts the entities mentioned in the user's input. The two tasks are usually solved jointly through multi-task learning, and the related techniques have become a focus of academic research. Based on deep neural network models, this thesis studies the task of natural language understanding.

First, natural language understanding models express sentence semantics by using neural networks for semantic composition. This thesis identifies two problems in existing semantic composition methods: (1) the same composition function is applied at every position of the input sentence, so the model cannot capture rich compositional semantics and lacks expressive power; (2) whether the model converges to a good solution depends on a manually tuned learning rate, and a poor setting makes training converge slowly or even diverge. To address these problems, a cross-task semantic composition model based on meta-learning is proposed, which effectively improves the performance of natural language understanding. The basic ideas are: (1) a shared meta network captures the meta-knowledge of semantic composition; (2) the parameters of the task model are generated directly by the meta network; (3) an LSTM serves as the meta network, and the output of each of its time steps updates the parameters of the task model. Experimental results show that the proposed method effectively resolves these problems, achieving 95.36% and 91.76% on the two public datasets ATIS and Snips, respectively.

Secondly, this thesis addresses the
problem that existing methods neither fully capture sentence semantics nor deeply mine contextual semantic relevance. Existing natural language understanding models suffer from two problems: (1) they realize multi-task learning of slot filling and intention recognition through a shared representation layer alone, without modeling the semantic relevance between the intention and the slot labels at the semantic layer, which degrades semantic understanding; (2) they neglect the semantic relevance between contexts at the semantic layer, causing a loss of information. The proposed solutions are: (1) a gating mechanism establishes the semantic connection between the two tasks, with a gating network integrating intention and semantic-slot information to understand the sentence; (2) a self-attention mechanism fuses the semantic information of the whole sentence into each word, avoiding the loss of information. Experiments show that the proposed method effectively solves the problems of semantic understanding and contextual semantic relevance in natural language understanding, bringing accuracy improvements of up to 4.7% and 11.67% on the ATIS and Snips datasets, respectively.
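The first contribution's core mechanism, a meta network that generates the task model's composition parameters at each time step, can be sketched in NumPy. This is only an illustration of the hypernetwork idea, not the thesis's implementation: all dimensions are hypothetical, a simple tanh recurrent cell stands in for the LSTM meta network, and the parameter-generation scheme is a plain linear map.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hid = 8, 16  # hypothetical embedding / hidden sizes

# Meta network: maps the current hidden state to the flattened
# parameters of the task model's composition function, so a
# different composition is generated at every position.
n_task = d_hid * (d_in + d_hid)
W_meta = rng.normal(0.0, 0.1, size=(n_task, d_hid))
b_meta = rng.normal(0.0, 0.1, size=n_task)

def compose(h, x):
    """One semantic-composition step with meta-generated parameters."""
    W_task = (W_meta @ h + b_meta).reshape(d_hid, d_in + d_hid)
    return np.tanh(W_task @ np.concatenate([x, h]))

def encode(tokens):
    """Fold a sequence of token embeddings into a sentence vector."""
    h = np.zeros(d_hid)
    for x in tokens:
        h = compose(h, x)  # composition parameters differ per position
    return h

sent = rng.normal(size=(5, d_in))  # 5 hypothetical token embeddings
vec = encode(sent)
print(vec.shape)  # (16,)
```

Because `W_task` is recomputed from the hidden state at every step, the composition function varies across positions, which is the expressive-power fix described above.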
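The second contribution's two fixes, sentence-level self-attention and gated intent-slot fusion, can likewise be sketched in NumPy. This is a simplified illustration under assumed shapes; the mean-pooled intent vector, the sigmoid gate form, and all dimensions are hypothetical rather than the thesis's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(1)
T, d = 5, 16  # hypothetical sentence length and hidden size

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(H, Wq, Wk, Wv):
    """Let every word attend over the whole sentence."""
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    A = softmax(Q @ K.T / np.sqrt(d))  # (T, T) attention weights
    return A @ V                       # context-fused word states

H = rng.normal(size=(T, d))  # shared encoder states (assumed)
Wq, Wk, Wv, Wg = (rng.normal(0.0, 0.1, size=(d, d)) for _ in range(4))

C = self_attention(H, Wq, Wk, Wv)  # each word sees full-sentence context
intent = C.mean(axis=0)            # sentence-level intent representation

# Gate: decide, per word and per dimension, how much intent
# information to mix into each slot-filling representation.
g = 1.0 / (1.0 + np.exp(-(C @ Wg + intent)))  # sigmoid gate, (T, d)
slots = g * C + (1.0 - g) * intent            # gated intent-slot fusion
print(slots.shape)  # (5, 16)
```

The gate lets each word's slot representation draw on the predicted intent, establishing the intent-slot semantic connection at the semantic layer rather than only sharing an encoder.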
Keywords/Search Tags:Natural Language Understanding, Slot Filling, Intention Detection, Meta Learning, Pre-training Language Model