
Deep Contextual Word Embedding In Natural Language Processing

Posted on: 2019-07-23   Degree: Master   Type: Thesis
Country: China   Candidate: J D Xu   Full Text: PDF
GTID: 2428330626452110   Subject: Computer technology
Abstract/Summary:
With the rapid enrichment of Internet life, the web is flooded with massive amounts of text, and processing this text with artificial intelligence techniques has become an important research area. Since text can be viewed as a sequence of words, these symbols must first be represented as vectors that a computer can operate on before the text can be processed by such techniques. Traditional neural word embeddings assign each word a single fixed vector. However, polysemy is very common in natural language, and a single vector can hardly express a word's different senses. Contextual word representations, which change as the surrounding context changes, can alleviate this problem to some extent. Yet existing models that produce contextual word vectors all require pre-training, which brings two drawbacks: the corpora needed for pre-training are often difficult to obtain, and such models struggle with encrypted data.

This thesis proposes a contextual word representation method that requires no pre-training and can be embedded into any end-to-end model. The method uses a multi-channel temporal convolutional network to extract the contextual features of each word and fuses them with the word's vector representation, yielding a contextual word vector. The proposed method avoids the weaknesses of pre-training-based approaches and improves the quality of word representations by incorporating contextual information. We evaluate the proposed representation on a text classification task and a text intent matching task; the results show that it outperforms the basic (non-contextual) word representation.
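To make the idea concrete, the following is a minimal sketch of such a module, not the thesis author's exact architecture: a PyTorch layer that runs several temporal (1D) convolutions with different kernel sizes over the embedding sequence and fuses the resulting context features with the static word vectors. The kernel sizes, channel counts, and concatenation-plus-linear fusion used here are illustrative assumptions, since the abstract does not specify them.

    # Sketch only: contextual word vectors without pre-training, built from a
    # multi-channel temporal convolution fused with the original embeddings.
    # Hyperparameters and the fusion scheme are assumptions, not the thesis spec.
    import torch
    import torch.nn as nn

    class ContextualEmbedding(nn.Module):
        def __init__(self, vocab_size, emb_dim=128, channels=64, kernel_sizes=(3, 5, 7)):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
            # One 1D convolution per channel / kernel size; "same" padding keeps
            # one context feature vector per token position.
            self.convs = nn.ModuleList(
                [nn.Conv1d(emb_dim, channels, k, padding=k // 2) for k in kernel_sizes]
            )
            # Fuse the per-token context features with the static word vector.
            self.fuse = nn.Linear(emb_dim + channels * len(kernel_sizes), emb_dim)

        def forward(self, token_ids):                  # (batch, seq_len)
            x = self.embed(token_ids)                  # (batch, seq_len, emb_dim)
            h = x.transpose(1, 2)                      # (batch, emb_dim, seq_len)
            ctx = torch.cat([torch.relu(c(h)) for c in self.convs], dim=1)
            ctx = ctx.transpose(1, 2)                  # (batch, seq_len, channels * 3)
            return torch.tanh(self.fuse(torch.cat([x, ctx], dim=-1)))

    # Usage: drop the module into any end-to-end model, e.g. a text classifier.
    if __name__ == "__main__":
        enc = ContextualEmbedding(vocab_size=10000)
        ids = torch.randint(1, 10000, (2, 20))         # two sentences of 20 tokens
        print(enc(ids).shape)                          # torch.Size([2, 20, 128])

Because the module is trained jointly with the downstream task, it needs no external pre-training corpus, which is the property the abstract emphasizes.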
Keywords/Search Tags:Deep Learning, Contextual Word Embedding, Text Classification, Intent Matching