The Industrial Internet of Things (IIoT) uses Internet of Things and information technologies to collect, transmit, analyze, and apply industrial data, with industrial equipment at its core, thereby improving industrial production efficiency. It is an important driver of industrial informatization. In the IIoT, large amounts of data must be processed and used for real-time control. Cloud computing offers one solution, providing powerful computing and storage capabilities: devices can offload computation-intensive tasks to the cloud for execution, shortening execution time. However, cloud computing also faces challenges such as high latency, network congestion, and data security. Edge computing, as a supplement to cloud computing, improves computing efficiency by deploying computing resources at the network edge, close to the data source. It meets the IIoT's needs in data analysis, real-time control, and other respects while protecting data security and privacy. Nevertheless, task offloading in edge computing faces many challenges of its own: the IIoT runs many types of tasks with different priorities and deadline requirements, and edge nodes have limited and heterogeneous resources. Under such circumstances, designing an efficient computation offloading strategy that allocates tasks among end devices and edge servers in a reasonable way becomes an important problem. Offloading decisions must therefore take both task attributes and system state into account to ensure that tasks complete effectively and on time. This thesis accordingly addresses heterogeneous multi-task computation offloading in the IIoT, considering the factors above, and studies both independent and dependent tasks. The main work is as follows:

For the problem of scheduling independent tasks with different
priorities in the Industrial Internet of Things, this thesis considers a single-server, multi-device edge computing model. The model divides terminal devices into two categories: devices that randomly generate tasks with different priorities, and idle devices. Tasks can be computed locally on the generating device, offloaded to the edge server, or forwarded to idle devices for computation through D2D communication. A priority queue is maintained on the edge server so that high-priority tasks are always served before low-priority ones, reducing their waiting time. On this basis, this thesis proposes a satisfaction indicator that jointly considers task priority, task deadline, and task abandonment, and formulates a satisfaction-maximization problem. To solve it, the offloading decision process is discretized, modeled as a Markov decision process, and solved with a reinforcement learning method based on Double Deep Q-Network (DDQN). Simulation results show that the proposed algorithm outperforms greedy and random strategies in both satisfaction and average task time: it significantly reduces the average task time while ensuring the completion rate of high-priority tasks.

For the problem of scheduling dependent tasks with different latency constraints in the Industrial Internet of Things, this thesis builds a Directed Acyclic Graph (DAG) task model in a multi-server, multi-device edge computing scenario. In this scenario, each device is connected to a specific edge server and must execute multiple complex tasks, some of which are delay-sensitive. Each complex task can be divided into subtasks, which can be executed locally, offloaded to the connected edge server, or forwarded by that server to other edge servers. To optimize the total task execution time and improve the utilization of system computing resources, this thesis formulates the optimization problem of
minimizing the completion time of the last task. To solve this problem, this thesis proposes a heuristic Sliding Rank Scheduling (SRS) algorithm. The algorithm estimates an initial rank for each task through linear interpolation, iteratively adjusts the ranks, and then uses them in a topological sort so that the resulting schedule meets all deadline constraints. Finally, the proposed algorithm is compared by simulation against the Earliest Deadline First Scheduling algorithm (EDFS) and the Improved Earliest Deadline First Scheduling algorithm (IEDFS) under different parameter settings. The results show that SRS achieves shorter task completion times than the other two algorithms across different problem scales, and its advantage grows as the proportion of delay-sensitive tasks increases.
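The core idea of rank-guided DAG scheduling can be illustrated with a minimal sketch. This is not the thesis's SRS algorithm (whose interpolation and iterative adjustment are defined in the thesis itself); it only assumes, for illustration, that each subtask carries a deadline, that the initial rank is obtained by linearly interpolating deadlines onto [0, 1], and that ready tasks are dispatched in rank order via a topological sort. The function name and data layout are hypothetical.

```python
from collections import defaultdict

def deadline_rank_schedule(tasks, edges):
    """Illustrative stand-in for rank-guided DAG scheduling.

    tasks : dict mapping subtask name -> deadline (smaller = more urgent)
    edges : list of (u, v) pairs meaning u must finish before v starts
    Returns a precedence-respecting execution order.
    """
    # Initial rank: linear interpolation of deadlines onto [0, 1].
    lo, hi = min(tasks.values()), max(tasks.values())
    span = (hi - lo) or 1  # avoid division by zero if all deadlines equal
    rank = {t: (d - lo) / span for t, d in tasks.items()}

    # Build successor lists and in-degrees for a Kahn-style topological sort.
    succ = defaultdict(list)
    indeg = {t: 0 for t in tasks}
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1

    # Among currently ready subtasks, always dispatch the lowest rank
    # (most urgent deadline) first.
    ready = [t for t in tasks if indeg[t] == 0]
    order = []
    while ready:
        ready.sort(key=lambda t: rank[t])
        t = ready.pop(0)
        order.append(t)
        for v in succ[t]:
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    return order

# Example: B has the tightest deadline among ready tasks, so it runs first.
print(deadline_rank_schedule(
    {"A": 10, "B": 4, "C": 8, "D": 12},
    [("A", "C"), ("B", "C"), ("C", "D")]))
```

The deadline-based tie-breaking among ready tasks is what distinguishes this ordering from a plain topological sort; SRS additionally slides the ranks iteratively until every subtask's scheduled finish time satisfies its deadline constraint.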