A question answering (Q&A) system accepts questions posed by users in natural language and retrieves or reasons out the answers users need from large amounts of heterogeneous data. With the development of deep learning, more and more deep-learning-based question answering systems have been proposed, but most of these models cannot explain how they reason. Inspired by the workings of human memory, researchers have therefore proposed many deep learning models based on memory modeling. Such models have strong reasoning ability, and their reasoning can be explained from the model structure. This paper studies existing memory network models and improves them for use in question answering systems. The research in this paper covers the following two aspects.

(1) Densely connected memory networks are proposed. Through an in-depth study of traditional end-to-end memory networks and relational reasoning models, we conclude that the traditional end-to-end memory network performs poorly on relational reasoning tasks because its multi-hop mechanism loses information and the model lacks a structure for acquiring relational features. To this end, we propose densely connected memory networks, which add dense connections, gating mechanisms, and multi-layer perceptrons to the multi-hop mechanism of the traditional end-to-end memory network. Dense connections allow the model to consider the existing facts more fully; the filtering mechanism is combined with the gating mechanism; and the multi-layer perceptron extracts relational features from the existing facts. Together these improve the model's relational representation ability and ultimately its relational reasoning ability. The proposed model and existing memory network models are evaluated on question answering datasets, and the experimental results show that densely connected memory networks have strong relational reasoning ability.

(2) Densely connected multi-head attention memory networks are proposed. To enable the model to complete multiple text reasoning tasks, we conducted an in-depth study of the working memory network. Because its reasoning module breaks the chain of progressive reasoning, the working memory network cannot solve progressive reasoning tasks well. To this end, we propose using a multi-head attention mechanism to enhance the model's feature representation ability, and dense connections with linear transformation structures in place of the reasoning module. The multi-head attention mechanism separates the training of the word embedding matrix from the computation of attention, so the model can acquire more complex attention patterns and better complete multiple text inference tasks simultaneously. Dense connections take fuller account of the output of each layer and, through the different inputs, retain the hierarchical information of the inference process. Linear transformations obtain relational features with the simplest possible structure. On Q&A datasets, densely connected multi-head attention memory networks show better stability, faster convergence, and faster training. This paper also compares densely connected multi-head attention memory networks with the working memory network on progressive reasoning tasks; the experimental results show that the proposed model has clear advantages in progressive reasoning. In addition, we apply the proposed model to visual question answering, where densely connected multi-head attention memory networks combined with a relational reasoning module achieve better results.
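The dense connection and gating idea of part (1) can be illustrated with a minimal numpy sketch. This is a simplification under stated assumptions, not the thesis's exact architecture: the functions `memory_hop` and `dense_hop`, the sigmoid gate form, and the random weights are all hypothetical, kept only to show how each hop's query can see the outputs of all previous hops and mix old and new information through a gate.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def memory_hop(q, memory):
    """One attention hop over memory: weight facts by similarity to the query."""
    scores = memory @ q                 # (n_facts,)
    p = softmax(scores)                 # attention over facts
    return p @ memory                   # weighted sum of facts -> output vector

def dense_hop(q, memory, hop_outputs, W_g):
    """Densely connected hop: the gate input concatenates the query with the
    outputs of ALL previous hops (the dense connection), and a sigmoid gate
    decides how much new information to write into the query."""
    o = memory_hop(q, memory)
    dense_in = np.concatenate([q] + hop_outputs + [o])  # dense connection
    g = 1.0 / (1.0 + np.exp(-(W_g @ dense_in)))         # gating vector in (0, 1)
    return g * o + (1.0 - g) * q                        # gated query update

rng = np.random.default_rng(0)
d, n_facts, hops = 8, 5, 3
memory = rng.normal(size=(n_facts, d))  # toy "facts" as random vectors
q = rng.normal(size=d)                  # toy question encoding

outputs = []
for k in range(hops):
    # gate weights widen with the dense input: q + k previous outputs + new o
    W_g = rng.normal(size=(d, d * (k + 2))) * 0.1
    q = dense_hop(q, memory, outputs, W_g)
    outputs.append(q.copy())

print(q.shape)  # final query state, shape (8,)
```

In a plain multi-hop memory network the query at hop k depends only on hop k-1; here every hop's gate sees the full history, which is the sense in which dense connections "more fully consider the existing facts."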
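The multi-head attention mechanism of part (2) can likewise be sketched in numpy. This is an illustrative single-query version of standard scaled dot-product multi-head attention, not the thesis's trained model: the projection matrices `Wq`, `Wk`, `Wv` and all sizes are assumed for the example. Each head attends over the same memory in its own subspace, which is how the model can acquire several attention patterns at once.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def multi_head_attention(q, memory, Wq, Wk, Wv, n_heads):
    """Multi-head attention of a single query vector over stored facts.
    Each head projects the query, keys, and values into its own subspace,
    so different heads can learn different attention patterns."""
    d = q.shape[0]
    dh = d // n_heads
    Q = (Wq @ q).reshape(n_heads, dh)             # per-head queries, (h, dh)
    K = (memory @ Wk.T).reshape(-1, n_heads, dh)  # per-head keys,   (n, h, dh)
    V = (memory @ Wv.T).reshape(-1, n_heads, dh)  # per-head values, (n, h, dh)
    out = np.zeros((n_heads, dh))
    for h in range(n_heads):
        scores = K[:, h, :] @ Q[h] / np.sqrt(dh)  # scaled dot-product scores
        p = softmax(scores)                       # attention over the n facts
        out[h] = p @ V[:, h, :]                   # weighted sum of values
    return out.reshape(d)                         # concatenate the heads

rng = np.random.default_rng(1)
d, n_heads, n_facts = 8, 2, 5
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
memory = rng.normal(size=(n_facts, d))  # toy memory of encoded facts
q = rng.normal(size=d)                  # toy question encoding

attended = multi_head_attention(q, memory, Wq, Wk, Wv, n_heads)
print(attended.shape)  # (8,)
```

Because the projections `Wq`, `Wk`, `Wv` are learned separately from the word embedding matrix, attention computation is decoupled from embedding training, matching the separation described above.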