Recently, neural network based methods have made remarkable progress on various Natural Language Processing (NLP) tasks. However, it remains a challenge to model both short and long texts, e.g., sentences and documents. In this paper, we propose a Hierarchical Attention Bidirectional LSTM (HA-BLSTM) to model both sentences and documents. HA-BLSTM effectively obtains a hierarchy of representations from words to phrases through its hierarchical structure. We design two attention mechanisms: a local and a global attention mechanism. The local attention mechanism learns which components of a text are more important for modeling the whole text, while the global attention mechanism learns which representations of the same text are crucial. Thus, HA-BLSTM can model long documents as well as short sentences. Experiments on four benchmark datasets show that our model yields superior classification performance over a number of strong baselines.

Bi-directional LSTM (BLSTM) often utilizes an Attention Mechanism (AM) to improve its ability to model sentences. However, the additional parameters within the AM may lead to difficulties in model selection and BLSTM training. To solve this problem, this paper redefines the AM from the novel perspective of quantum cognition and proposes a parameter-free Quantum AM (QAM). Furthermore, we give a quantum interpretation of BLSTM with the Two-State Vector Formalism (TSVF) and find a similarity between sentence understanding and quantum Weak Measurement (WM) under TSVF. The weak value derived from WM is employed to represent the attention for words in a sentence. Experiments show that a QAM-based BLSTM outperforms a common AM (CAM) [1] based BLSTM on most of the classification tasks discussed in this paper.
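To make the local attention idea concrete, the following is a minimal, dependency-free sketch of standard attention pooling over word representations: each word's hidden state is scored against a context vector, the scores are normalized with a softmax, and the text representation is the weighted sum. The function names (`softmax`, `attention_pool`) and the toy vectors are illustrative assumptions, not the paper's actual implementation (which operates on BLSTM hidden states with learned parameters).

```python
import math

def softmax(scores):
    # numerically stable softmax over a list of scores
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden_states, context):
    """Weight word states by relevance to a context vector and pool them.

    hidden_states: list of word vectors (lists of floats, same dimension)
    context: a context vector of the same dimension (learned, in practice)
    Returns the pooled text vector and the per-word attention weights.
    """
    # score = dot product between each word state and the context vector
    scores = [sum(h_d * c_d for h_d, c_d in zip(h, context))
              for h in hidden_states]
    weights = softmax(scores)
    dim = len(hidden_states[0])
    # pooled vector = attention-weighted sum of word states
    pooled = [sum(w * h[d] for w, h in zip(weights, hidden_states))
              for d in range(dim)]
    return pooled, weights

# Toy example: three 2-d "word states"; the first and third align with context
states = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
pooled, weights = attention_pool(states, context=[1.0, 0.0])
```

Words whose states align with the context vector receive larger weights, which is the sense in which attention "learns which components of a text are more important" once the context vector is trained.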