
Research On Edge And Cloud Collaborative Computing Model And Algorithm Based On Deep Neural Network

Posted on: 2022-05-09
Degree: Master
Type: Thesis
Country: China
Candidate: C Y Zhong
Full Text: PDF
GTID: 2518306731477704
Subject: Computer technology

Abstract/Summary:
With the rapid development of modern information technologies such as 5G, the Industrial Internet of Things, and artificial intelligence, industrial intelligence applications that combine Industry 4.0 with AI have been widely adopted. To support this development, deep neural networks (DNNs) have become a common technology in industrial equipment scenarios. However, DNN tasks characterized by directed acyclic graphs (DAGs) are growing rapidly in modern industry and are computationally heavy, so it is difficult to meet the latency and accuracy requirements of these critical tasks at the same time. A recent line of work accelerates DNN inference by partitioning it between cloud and edge computing, realizing efficient edge-cloud collaboration. However, dynamic network conditions and the uncertain availability of cloud computing resources pose great challenges to such collaborative inference, making it difficult to guarantee the QoS requirements of DNN tasks.

To address these problems, this thesis applies DNN partition inference and machine learning methods to propose an on-demand accelerated partition inference scheme for DAG-type DNNs and an adaptive early-exit selection scheme, together with the corresponding architectures and algorithms. The main contributions are summarized as follows.

First, to meet the high-accuracy QoS requirements of DAG-type DNN tasks under low-latency constraints, a DAG-type DNN on-demand accelerated partition inference scheme (DDPI) is proposed. An intelligent partition inference architecture (EDDI) is designed to support it. The architecture introduces two novel components: (1) the Evaluator, which estimates runtime impact factors and assists the Optimizer in constructing a suitable DDPI configuration for the DAG; and (2) the Optimizer, which selects the best early-exit point and the best DNN partition at runtime, improving quality of service while meeting user-defined latency requirements. It is then proved in detail that traditional DAG-type DNN partition schemes cannot guarantee the optimal partition under realistic settings. Finally, a DDPI scheme based on the minimum-cut algorithm is proposed to achieve optimal partition inference and effectively improve model inference accuracy.

Second, to reduce the large time cost of making online decisions for on-demand accelerated partition inference, this thesis further proposes an adaptive early-exit selection scheme (AES). Building on the EDDI architecture, a predictive classifier is introduced and a collaborative inference architecture supporting adaptive early-exit selection is designed. Based on the open-source edge computing framework EdgeX, a runtime parameter management strategy for the predictive classifier is proposed. Finally, a prediction-based adaptive early-exit scheme is proposed to select the best early-exit point adaptively, which effectively improves the efficiency of the online DNN inference evaluation algorithm.

The feasibility and performance of both schemes are evaluated on the CIFAR dataset. The experimental results show that, compared with the baseline and state-of-the-art schemes, DDPI has advantages in model inference accuracy, inference latency, and overall throughput, while AES significantly improves algorithm decision time.
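The partition decision described above can be illustrated for the simpler linear-chain case, where the minimum-cut formulation reduces to enumerating split points: layers before the split run on the edge, the intermediate tensor is uploaded, and the remaining layers run in the cloud. The function name, per-layer timings, tensor sizes, and bandwidth below are illustrative assumptions, not values from the thesis.

```python
def best_split(edge_ms, cloud_ms, out_kb, input_kb, up_kbps):
    """Pick the latency-minimizing split of a linear DNN.

    Layers [0, k) run on the edge, layers [k, n) in the cloud; the
    tensor crossing the split is uploaded over a link of up_kbps.
    Sizes are in kilobits so transfer time (ms) = size / up_kbps * 1000.
    Returns (split_index, total_latency_ms). Hypothetical sketch only.
    """
    n = len(edge_ms)
    best = None
    for k in range(n + 1):
        # k == 0: everything in the cloud, the raw input is uploaded;
        # k == n: everything on the edge, nothing is uploaded.
        sent = input_kb if k == 0 else out_kb[k - 1]
        transfer = 0.0 if k == n else sent / up_kbps * 1000
        lat = sum(edge_ms[:k]) + transfer + sum(cloud_ms[k:])
        if best is None or lat < best[1]:
            best = (k, lat)
    return best

# Example with made-up numbers: a mid-network split wins because the
# layer-2 activation is small relative to the input and layer-3 output.
print(best_split(edge_ms=[10, 10, 10], cloud_ms=[1, 1, 1],
                 out_kb=[50, 5, 40], input_kb=500, up_kbps=1000))
```

For general DAG-structured models, as in the thesis, the same trade-off is encoded as an s-t minimum cut over a flow network whose edge capacities combine computation and transmission costs; the exhaustive enumeration above is only tractable for chains.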
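The adaptive early-exit selection can be sketched as a simple policy over candidate exit branches: among the exits whose predicted latency meets the user-defined deadline, choose the one with the highest expected accuracy. The function name, tuple layout, and numbers below are illustrative assumptions; the thesis uses a trained predictive classifier rather than fixed per-exit estimates.

```python
def select_exit(exits, deadline_ms):
    """Choose an early-exit branch under a latency deadline.

    exits: list of (latency_ms, expected_accuracy) tuples, ordered
    from the shallowest exit to the deepest (final) exit.
    Returns the most accurate exit that still meets the deadline,
    falling back to the shallowest exit if none is feasible.
    Hypothetical sketch only.
    """
    feasible = [e for e in exits if e[0] <= deadline_ms]
    if not feasible:
        # No exit meets the deadline: degrade gracefully to the
        # cheapest branch rather than miss the deadline entirely.
        return min(exits, key=lambda e: e[0])
    return max(feasible, key=lambda e: e[1])

# Example with made-up numbers: three exit branches of a DNN.
branches = [(5, 0.70), (12, 0.85), (30, 0.95)]
print(select_exit(branches, deadline_ms=15))  # the 12 ms exit fits
print(select_exit(branches, deadline_ms=2))   # none fits; shallowest
```

Precomputing such a policy (or predicting it with a classifier, as the AES scheme does) avoids evaluating every candidate partition online, which is the source of the decision-time improvement reported above.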
Keywords/Search Tags:Cloud computing, Edge computing, Edge intelligence, Deep neural networks, Computation offloading