Research On Cross-lingual Spoken Language Understanding

Posted on: 2022-03-25  Degree: Master  Type: Thesis
Country: China  Candidate: Q X Li  Full Text: PDF
GTID: 2518306572459764  Subject: Computer technology
Abstract/Summary:
With the rapid development of natural language processing in recent years, artificial intelligence has penetrated many aspects of daily life. A variety of AI products have appeared on the market, and people increasingly expect computers to help them complete more complex tasks. Thanks to advances in deep learning and big data, task-oriented dialogue voice assistants have gradually entered people's daily lives. These assistants can not only engage in small talk with users but also intelligently handle many everyday needs, such as playing music, making phone calls, and booking tickets. The core technology behind them is spoken language understanding.

At present, researchers in China and abroad have contributed extensively to spoken language understanding in Chinese and English, but studies in the cross-lingual setting remain scarce. As a result, speakers of low-resource languages often have to use English or Chinese in order to use voice assistants, which causes great inconvenience. To address this problem, this thesis is devoted to spoken language understanding in cross-lingual scenarios applicable to multiple low-resource languages, filling a gap in current research in this field.

First, we construct two datasets for the cross-lingual spoken language understanding task to support model training and evaluation, and design both pipeline and end-to-end models for experiments on this task. Second, we design a zero-shot transfer learning algorithm that transfers from English to other languages in cross-lingual scenarios, and demonstrate its superiority through a series of experiments. Finally, we design a pre-training method for cross-lingual models in the task-oriented dialogue domain that targets both language and task objectives, and verify its effectiveness through a series of experiments. Combining the proposed pre-training method with the zero-shot transfer learning algorithm greatly improves the overall performance of the cross-lingual spoken language understanding model.
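For readers unfamiliar with the task setup, the sketch below illustrates the general zero-shot cross-lingual SLU recipe the abstract refers to: a joint intent-classification and slot-filling head on top of a multilingual pre-trained encoder, fine-tuned on English data only and then applied directly to utterances in other languages. This is a minimal illustration of the standard technique, not the thesis's actual model; the choice of XLM-RoBERTa, the label-set sizes, and the loss weighting are all assumptions made for the example.

```python
# Minimal sketch of zero-shot cross-lingual SLU: a joint intent/slot model on a
# multilingual encoder, fine-tuned on English and applied to other languages as-is.
# XLM-RoBERTa, the label sizes, and the loss setup are illustrative assumptions,
# not the thesis's actual configuration.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "xlm-roberta-base"  # assumption: any multilingual encoder works here

class JointSLU(nn.Module):
    def __init__(self, num_intents: int, num_slots: int):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(MODEL_NAME)
        hidden = self.encoder.config.hidden_size
        self.intent_head = nn.Linear(hidden, num_intents)  # sentence-level intent
        self.slot_head = nn.Linear(hidden, num_slots)      # token-level BIO slots

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        token_states = out.last_hidden_state                   # (batch, seq, hidden)
        intent_logits = self.intent_head(token_states[:, 0])   # pooled <s> token
        slot_logits = self.slot_head(token_states)             # per-token labels
        return intent_logits, slot_logits

def joint_loss(intent_logits, slot_logits, intent_labels, slot_labels):
    # Fine-tuning on English only: the joint loss sums intent and slot cross-entropy.
    ce = nn.CrossEntropyLoss(ignore_index=-100)  # -100 masks padding/extra subwords
    return ce(intent_logits, intent_labels) + ce(
        slot_logits.view(-1, slot_logits.size(-1)), slot_labels.view(-1)
    )

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = JointSLU(num_intents=10, num_slots=21)  # sizes depend on the dataset

# Zero-shot inference on a non-English utterance, with no target-language training:
batch = tokenizer("mets de la musique", return_tensors="pt")
with torch.no_grad():
    intent_logits, slot_logits = model(batch["input_ids"], batch["attention_mask"])
print(intent_logits.argmax(-1), slot_logits.argmax(-1))
```

Because the multilingual encoder shares a subword vocabulary and parameters across many languages, a model fine-tuned on English data alone can be applied to unseen languages; this is the zero-shot transfer setting the abstract describes, which the thesis further improves with its own transfer algorithm and pre-training method.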
Keywords/Search Tags: Cross-lingual, Spoken Language Understanding, Zero-shot Transfer Learning, Pre-trained Model