
Research On The Optimization Technologies Of Weighted Neural Network Models For Question Answering System

Posted on: 2020-02-16  Degree: Master  Type: Thesis
Country: China  Candidate: H Yuan  Full Text: PDF
GTID: 2428330575989315  Subject: Computer application technology
Abstract/Summary:
The rapid development of information technology has brought about an explosive growth in the amount of data. Question answering systems focus on finding answers within this vast amount of information; answer-selection question answering systems produce their output by filtering and ranking the candidate answers that satisfy the question. To address the issues of missing natural-science scenarios, insufficient logical reasoning ability, and unreasonable distribution of feature weights, the main work of this thesis is as follows. Firstly, an Attention-based LSTM model is proposed to optimize the answer-selection question answering system in the science-examination scenario. This model selects the best answer from four candidate answers according to the question. Without the help of an external knowledge base, it achieves accuracies of 35.3% and 46.5% on the English and Chinese data subsets, respectively. Then, an Attention-based CNN-LSTM model is proposed to further optimize the answer-selection question answering system in the science-examination scenario. This model uses the given question and a related knowledge base to find the correct answer from two candidate answers; the experimental results show that its classification accuracy reaches 71.4%. Finally, a Bi-directional LSTM model based on TF-IDF weights is proposed to solve the problem of unreasonable weight distribution in the attention mechanism. This model raises the classification accuracy on the English and Chinese data subsets to 41.2% and 52.3%, respectively, and at the same time increases the accuracy in the reading comprehension scenario to 78.3%. Lastly, we improve the accuracy to 81.1% through a model ensemble method.
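The abstract does not give implementation details, but the core idea of the final model, biasing the attention weights of a BiLSTM encoder with TF-IDF term weights so that frequent but uninformative words receive less attention mass, can be sketched roughly as follows. This is a minimal illustration in PyTorch under stated assumptions; the class name TfidfAttentionScorer, the layer sizes, and the way TF-IDF weights enter the attention softmax are all assumptions, not the author's actual code.

```python
# Minimal sketch (assumption, not the thesis's code): a BiLSTM encoder whose
# token-level attention scores are shifted by log TF-IDF weights before the
# softmax, then used to score a question against a candidate answer.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TfidfAttentionScorer(nn.Module):  # hypothetical name
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)   # unnormalized attention score per token
        self.out = nn.Bilinear(2 * hidden_dim, 2 * hidden_dim, 1)  # question-answer match score

    def encode(self, token_ids, tfidf):
        # token_ids: (batch, seq_len) word indices; tfidf: (batch, seq_len) TF-IDF weights
        h, _ = self.bilstm(self.embed(token_ids))           # (batch, seq_len, 2*hidden)
        scores = self.attn(h).squeeze(-1) + torch.log(tfidf + 1e-6)  # bias attention by TF-IDF
        alpha = F.softmax(scores, dim=-1)                   # (batch, seq_len)
        return torch.bmm(alpha.unsqueeze(1), h).squeeze(1)  # TF-IDF-weighted sentence vector

    def forward(self, q_ids, q_tfidf, a_ids, a_tfidf):
        q_vec = self.encode(q_ids, q_tfidf)
        a_vec = self.encode(a_ids, a_tfidf)
        return self.out(q_vec, a_vec).squeeze(-1)           # higher = better candidate answer
```

At inference time one would score every candidate answer against the question and select the highest-scoring one; the per-token TF-IDF weights could be computed with a standard tool such as scikit-learn's TfidfVectorizer fitted on the training corpus.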
Keywords/Search Tags: Question answering, Answer selection, LSTM, Attention mechanism, TF-IDF weights