With the digital and intelligent transformation under way across industries, State Grid Corporation of China, as a backbone enterprise that underpins the national economy and provides power services, is actively responding to this trend. Improving the automation level and service quality of power supply is of great importance, and the key lies in communicating with customers in a timely and effective manner and identifying their intentions and needs. Power consumption services are highly specialized and customers express themselves in many different ways, so designing models that can accurately identify users' business intents, and applying them in intelligent applications to improve customer service efficiency, is a highly challenging requirement.

In this work, we first studied the existing theories and models in natural language understanding (NLU) and the main solution paradigms of current research, summarizing the motivation behind each method, the problems it solves, its advantages, and its shortcomings. We selected experimental datasets according to practical application scenarios: we reviewed the main benchmark datasets in this field and additionally chose suitable Chinese dialogue datasets that reflect the characteristics of Chinese usage scenarios. After verifying the model's performance on these datasets, we transferred it to a power-domain dataset for testing and deployment and developed the engineering code for a solution platform. To address the issues observed in short-text interaction scenarios, this paper proposes two research components: an intent-slot implicit relationship extraction model based on improved attention and gated decoding, and an intent-slot explicit relationship fusion encoding model based on ERNIE and graph neural networks. The main contributions of this paper are as follows (minimal code sketches illustrating each part follow the list):

1. To address the insufficient use of intent-slot information in the decoding stage and the resulting low detection accuracy, we propose an intent-slot implicit relationship extraction model based on improved attention and gated decoding. An intent attention module and a slot attention module form the backbone network; the model applies a self-attention mechanism and introduces gate-unit channels as the improvement. On the ATIS and SNIPS short-text public datasets, our model improves accuracy by around 1.0% over the mainstream attention-based models currently in use.

2. Building on the implicit relationship modeling, we propose a novel encoding method that integrates ontology knowledge through graph convolutional networks, with an ERNIE encoding layer that integrates semantic knowledge as the pre-trained model. Combining these two parts yields a complete joint learning model covering both explicit and implicit relationship modeling. On the Chinese public short-text dataset ECDT-NLU, our model improves slot accuracy, intent accuracy, and joint accuracy by 23%, 15%, and 41%, respectively, compared with the current SOTA model.
3. We constructed a power-grid business knowledge graph using a method based on large-scale triple extraction. We propose an overall architecture for a task-oriented multi-turn question-answering (QA) system and design and implement a one-stop platform solution that includes multi-turn QA, permission allocation, and system monitoring. The system is built with Spring Cloud and Django as the server-side frameworks and Neo4j as the knowledge graph store. Testing shows that the system achieves high feasibility and stability.
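As a rough illustration of the first contribution, the PyTorch sketch below shows one way an intent attention channel and a slot attention channel can be combined through a gate unit at decoding time. The layer sizes, the pooling choice, and the exact gating formula are illustrative assumptions, not the paper's precise architecture.

```python
# Minimal PyTorch sketch of an intent/slot attention decoder with a gate
# channel. Dimensions, label counts, and the gating formula are illustrative
# assumptions rather than the exact design described in the paper.
import torch
import torch.nn as nn


class IntentSlotGateDecoder(nn.Module):
    def __init__(self, hidden_dim: int, num_intents: int, num_slots: int, num_heads: int = 4):
        super().__init__()
        # Separate self-attention channels for intent and slot information.
        self.intent_attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.slot_attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        # Gate unit: lets the intent context modulate the slot context.
        self.gate_proj = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.gate_vec = nn.Parameter(torch.randn(hidden_dim))
        self.intent_out = nn.Linear(hidden_dim, num_intents)
        self.slot_out = nn.Linear(2 * hidden_dim, num_slots)

    def forward(self, enc: torch.Tensor):
        # enc: (batch, seq_len, hidden_dim) encoder states of the utterance.
        c_intent, _ = self.intent_attn(enc, enc, enc)   # per-token intent context
        c_slot, _ = self.slot_attn(enc, enc, enc)       # per-token slot context
        # Utterance-level intent logits from the mean-pooled intent context.
        intent_logits = self.intent_out(c_intent.mean(dim=1))
        # Gate channel: a scalar gate per token computed from both contexts.
        gate = torch.tanh(c_slot + self.gate_proj(c_intent))
        gate = (gate * self.gate_vec).sum(dim=-1, keepdim=True)  # (batch, seq_len, 1)
        slot_logits = self.slot_out(torch.cat([enc, c_slot * gate], dim=-1))
        return intent_logits, slot_logits


# Usage: a batch of 2 utterances, 10 tokens each, hidden size 256.
decoder = IntentSlotGateDecoder(hidden_dim=256, num_intents=21, num_slots=120)
intent_logits, slot_logits = decoder(torch.randn(2, 10, 256))
```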
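For the second contribution, the sketch below illustrates the general idea of fusing an ERNIE encoder with a graph convolution over ontology (intent/slot) nodes. The Hugging Face checkpoint name, the single GCN layer, and the attention-style fusion are assumptions made for illustration, not the paper's exact encoder.

```python
# Minimal sketch of a fusion encoder: ERNIE provides token representations and
# one graph-convolution layer propagates ontology node features over an
# explicit relation graph. Checkpoint name and fusion scheme are assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "nghuyong/ernie-3.0-base-zh"  # assumed publicly available ERNIE checkpoint


class GraphConv(nn.Module):
    """One GCN layer: H' = ReLU(norm_adj @ H @ W)."""
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, node_feats: torch.Tensor, adj: torch.Tensor):
        # Symmetrically normalise the adjacency matrix (with self-loops).
        adj = adj + torch.eye(adj.size(0))
        deg_inv_sqrt = adj.sum(dim=-1).pow(-0.5)
        norm_adj = deg_inv_sqrt.unsqueeze(1) * adj * deg_inv_sqrt.unsqueeze(0)
        return torch.relu(self.linear(norm_adj @ node_feats))


class ErnieGcnEncoder(nn.Module):
    def __init__(self, num_nodes: int, node_dim: int = 128):
        super().__init__()
        self.ernie = AutoModel.from_pretrained(MODEL_NAME)
        hidden = self.ernie.config.hidden_size
        self.node_emb = nn.Embedding(num_nodes, node_dim)  # ontology nodes (intents/slots)
        self.gcn = GraphConv(node_dim, hidden)

    def forward(self, input_ids, attention_mask, adj):
        tokens = self.ernie(input_ids=input_ids,
                            attention_mask=attention_mask).last_hidden_state
        nodes = self.gcn(self.node_emb.weight, adj)       # (num_nodes, hidden)
        # Fuse: each token attends over ontology nodes and adds the pooled result.
        scores = torch.softmax(tokens @ nodes.T, dim=-1)  # (batch, seq, num_nodes)
        return tokens + scores @ nodes


tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
batch = tokenizer(["帮我查询上个月的电费账单"], return_tensors="pt")
adj = torch.ones(16, 16)  # toy fully connected ontology graph with 16 nodes
encoder = ErnieGcnEncoder(num_nodes=16)
fused = encoder(batch["input_ids"], batch["attention_mask"], adj)
```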
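For the knowledge-graph component of the third contribution, the sketch below shows one way extracted (head, relation, tail) triples can be loaded into Neo4j and answered as lookup queries using the official Python driver. The connection settings, node label, and example triples are placeholders, not the deployed system's schema.

```python
# Minimal sketch: load (head, relation, tail) triples into Neo4j and answer a
# lookup query with the official neo4j Python driver. Connection details,
# labels, and property names are illustrative placeholders.
from neo4j import GraphDatabase

URI = "bolt://localhost:7687"
AUTH = ("neo4j", "password")

triples = [
    ("电费账单", "查询渠道", "网上国网APP"),
    ("电费账单", "出账周期", "每月"),
]


def load_triples(driver, triples):
    # MERGE keeps the load idempotent: re-running does not duplicate nodes or edges.
    query = (
        "MERGE (h:Entity {name: $head}) "
        "MERGE (t:Entity {name: $tail}) "
        "MERGE (h)-[:REL {type: $rel}]->(t)"
    )
    with driver.session() as session:
        for head, rel, tail in triples:
            session.run(query, head=head, rel=rel, tail=tail)


def answer(driver, entity, relation):
    query = (
        "MATCH (h:Entity {name: $entity})-[r:REL {type: $relation}]->(t) "
        "RETURN t.name AS answer"
    )
    with driver.session() as session:
        return [record["answer"] for record in session.run(query, entity=entity, relation=relation)]


with GraphDatabase.driver(URI, auth=AUTH) as driver:
    load_triples(driver, triples)
    print(answer(driver, "电费账单", "查询渠道"))  # expected: ['网上国网APP']
```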