
Intelligent Question Answering Of Deep Recurrent Neural Network Based On Self-Attention Mechanism

Posted on: 2020-10-13
Degree: Master
Type: Thesis
Country: China
Candidate: C Zhang
Full Text: PDF
GTID: 2428330590482230
Subject: Software engineering
Abstract/Summary:
With the rapid spread of the Internet, users find it easier to acquire knowledge and information. This comes at the cost of data overload caused by explosive data growth, which reduces the efficiency of user access to information. Question answering can rapidly retrieve the information a user asks for, but traditional models have two main disadvantages: (1) They rely on hand-crafted feature engineering. This manually defined feature-extraction strategy requires professional knowledge, is strongly subjective, and generalizes poorly across different scenarios. (2) Traditional question-answering methods are built on statistical features such as word frequency and cannot capture context, which plays an important role in text analysis. With its rapid development in recent decades, deep learning has achieved significant breakthroughs in many research fields, showing that the feature-representation capability of deep models far exceeds that of shallow models. In addition, the introduction of various attention mechanisms improves the ability of deep models to express the meaning of complicated texts. Based on these observations, this thesis focuses on how to use deep models and attention mechanisms for question answering. The main research includes:

(1) For the feature-extraction problem, this thesis designs an attention-based BiLSTM model for feature extraction that can be applied to question answering. The LSTM network solves the vanishing-gradient problem of Recurrent Neural Networks (RNNs) by adding a three-"gate" structure. The BiLSTM model extracts features in both the forward and backward directions, which solves the shortage of context information in the extracted features.

(2) This thesis compares the effectiveness of several classical attention mechanisms on question answering and integrates the best-performing one, the self-attention mechanism, into deep models for question answering. The self-attention mechanism merges important information into the high-level feature representation of a text by computing a distribution vector over the importance of the different words in the text sequence. This mechanism avoids, to some extent, the dilution of information when extracting semantic features, and highlights the important information in the text sequence. The model is evaluated on public data sets such as InsuranceQA. The experimental results show that the deep recurrent neural network based on the self-attention mechanism proposed in this paper outperforms other deep-learning-based question-answering algorithms on question-answering tests.
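The self-attention pooling described above can be sketched in a few lines. This is a minimal, illustrative version (not the thesis's exact formulation): each position in a sequence of hidden vectors is scored by its dot-product similarity to the rest of the sequence, the scores are softmax-normalised into the importance distribution, and the hidden states are summed with those weights to form the sentence vector. The function names and the averaging scheme are assumptions for the sketch.

```python
import math

def softmax(scores):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention_pool(hidden_states):
    """Collapse a sequence of hidden vectors into one pooled vector.

    hidden_states: list of equal-length lists of floats, e.g. the
    per-timestep outputs of a BiLSTM. Returns (weights, pooled).
    """
    # Score each position by its mean dot product with every position
    # in the sequence (a simple stand-in for learned attention scores).
    scores = []
    for h in hidden_states:
        dot_sum = sum(
            sum(a * b for a, b in zip(h, other))
            for other in hidden_states
        )
        scores.append(dot_sum / len(hidden_states))

    # Normalise the scores into an importance distribution.
    weights = softmax(scores)

    # Weighted sum of the hidden states: the pooled sentence vector.
    dim = len(hidden_states[0])
    pooled = [
        sum(w * h[d] for w, h in zip(weights, hidden_states))
        for d in range(dim)
    ]
    return weights, pooled
```

In practice the scores would come from learned projections rather than raw dot products, but the softmax-weighted sum is the part that "highlights important information": positions whose vectors agree with the rest of the sequence receive larger weights in the pooled representation.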
Keywords/Search Tags:Question Answering, Deep Learning, LSTM, Self-Attention