
Semantic Role Labeling By Highway Bi-LSTM Model Based On Self-attention And Dependency Syntax

Posted on: 2021-03-29
Degree: Master
Type: Thesis
Country: China
Candidate: Y. L. Wang
Full Text: PDF
GTID: 2518306557487324
Subject: Software engineering
Abstract:
Semantic role labeling (SRL) is a foundational task in natural language processing. Current approaches are mainly based on deep learning, such as the Highway Bi-LSTM model. Although the Highway structure in this model accelerates network convergence, the following problems remain: 1) the model mainly uses features such as the word, its part of speech, and whether the current word is a predicate, and rarely uses syntactic structure information such as dependency syntax; 2) the model computes one time step per word in the sentence, so it is difficult to establish a semantic connection between words that are far apart in long sentences, resulting in poor performance on long sentences. To address these problems, the main work of this thesis is as follows:

(1) To address the underuse of syntactic structure information, the DRT-Highway Bi-LSTM model is proposed. It extracts the Dependency Relationship Type (DRT for short in this thesis) between words in a sentence and integrates it into the word vector representation of the Highway Bi-LSTM model. Experimental results show that, compared with the classic Highway Bi-LSTM model, the F1 score of the DRT-Highway Bi-LSTM model increases by 0.62 on the Chinese Proposition Bank (CPB) dataset and by 0.44 on the English CoNLL-2012 dataset.

(2) To address the poor performance on long sentences, a Self-Attention (SA for short in this thesis) mechanism is added to the Highway Bi-LSTM model, yielding the SA-Highway Bi-LSTM model. The self-attention mechanism establishes a direct semantic connection between words that are far apart in long sentences, which helps improve the model's performance on them. Long-sentence test sets are constructed from the CPB and CoNLL-2012 datasets. Experimental results show that, compared with the classic Highway Bi-LSTM model, the F1 score of the SA-Highway Bi-LSTM model increases by 1.15 on the Chinese long-sentence test set and by 0.09 on the English long-sentence test set.

Finally, both DRT and SA are added to the Highway Bi-LSTM model, yielding the DRT-SA-Highway Bi-LSTM model. Experimental results show that, compared with the SA-Highway Bi-LSTM model, its F1 score increases by 0.15 on the CPB dataset and by 0.42 on the CoNLL-2012 dataset; compared with the DRT-Highway Bi-LSTM model, its F1 score increases by 0.09 on the CPB dataset and by 0.04 on the CoNLL-2012 dataset.
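The building blocks named above can be illustrated with a minimal NumPy sketch: per-word feature vectors (word, part-of-speech, predicate flag, and a DRT embedding) are concatenated, passed through self-attention so every word attends directly to every other word regardless of distance, and then through a highway layer whose gate mixes a transform with an identity path. This is not the thesis's implementation; all dimensions, random weights, and the exact feature layout are illustrative assumptions, and the Bi-LSTM itself is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def highway_layer(x, W_h, b_h, W_t, b_t):
    """Highway layer: gate t mixes a tanh transform with the identity path."""
    h = np.tanh(x @ W_h + b_h)                    # candidate transform
    t = 1.0 / (1.0 + np.exp(-(x @ W_t + b_t)))    # transform gate in (0, 1)
    return t * h + (1.0 - t) * x                  # carry gate = 1 - t

def self_attention(X):
    """Scaled dot-product self-attention over a sentence matrix X of shape (T, d)."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                 # pairwise word-word scores
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)            # softmax over each row
    return w @ X                                  # each word as a mix of all words

# Hypothetical features for a 5-word sentence (all sizes are assumptions):
# word, POS, predicate flag, and dependency-relation-type (DRT) embeddings.
T, d_word, d_pos, d_drt = 5, 8, 4, 4
word_emb = rng.standard_normal((T, d_word))
pos_emb = rng.standard_normal((T, d_pos))
pred_flag = np.zeros((T, 1)); pred_flag[2, 0] = 1.0  # word 2 is the predicate
drt_emb = rng.standard_normal((T, d_drt))

# Concatenate DRT into the word representation, as the DRT models do.
X = np.concatenate([word_emb, pos_emb, pred_flag, drt_emb], axis=-1)
d = X.shape[-1]  # 8 + 4 + 1 + 4 = 17

W_h, b_h = 0.1 * rng.standard_normal((d, d)), np.zeros(d)
W_t, b_t = 0.1 * rng.standard_normal((d, d)), np.full(d, -1.0)  # bias toward carry

H = highway_layer(self_attention(X), W_h, b_h, W_t, b_t)
print(H.shape)  # (5, 17)
```

With the gate bias pushed strongly negative the highway layer approaches the identity, which is the property that lets gradients pass through many stacked layers and speeds up convergence.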
Keywords: Semantic Role Labeling, Highway Bi-LSTM, Dependency Relationship Type, Self-Attention mechanism