
Fog Computing Container Integration Based On Deep Reinforcement Learning

Posted on: 2024-04-12
Degree: Master
Type: Thesis
Country: China
Candidate: J Wang
Full Text: PDF
GTID: 2568307094981259
Subject: Management Science and Engineering
Abstract/Summary:
With the rapid development of cloud computing and wireless sensor technology, big-data applications supported by the Internet of Things (IoT) have spread across fields such as industry, medicine, and the military. Because the traditional single-cloud computing model struggles to meet the demands of geographically distributed IoT devices for low data-processing latency, high bandwidth, and real-time decision-making, fog computing has emerged: by placing computing and storage capabilities at the edge of the network, it processes latency-sensitive tasks in a timely manner, reducing both the volume of data transmitted over the network and the latency. As the core infrastructure of fog computing, China's data centers have grown exponentially in scale with the generation of massive data in recent years, becoming major energy consumers and carbon emitters. This paper therefore proposes a fog computing container integration method based on deep reinforcement learning (DR-CC), aiming to minimize response time and energy consumption. The main research work and innovations of this paper are as follows:

(1) Taking fog computing resource deployment, task response time, and computing resource utilization into comprehensive consideration, containers are used as the unit of computing resources that serves terminal devices. Compared with virtual machines, containers are easier to deploy and offer higher performance. As container technology has matured and spread, containers have gradually replaced virtual machines as the main virtualization technology for fog computing resource scheduling, yet most current task-scheduling algorithms are still based on virtual machines. This paper therefore studies fog computing resource integration with the container as the scheduling unit, using Checkpoint/Restore In Userspace (CRIU) to migrate containers live and consolidate resources, which in turn improves the overall resource utilization of the system.

(2) To address the dynamics of fog computing tasks, the high energy consumption of data centers, and tasks' low-latency requirements, a dynamic task-load model, an energy-consumption model, and a delay model are constructed. By making the fog computing resource-scheduling process concrete and measurable, the energy consumption of the fog computing system and the quality of service experienced by users can be quantified more accurately.

(3) Based on the fog computing system architecture, and targeting the high energy consumption of data centers, the stochastic dynamics of application task loads, and users' low-latency requirements, a container integration method based on the Advantage Actor-Critic (A2C) algorithm is proposed, which uses neural-network approximators to model the data-center infrastructure accurately and minimize its energy consumption and average response time. On top of the A2C algorithm, an end-to-end decision model mapping the fog computing system state to a container integration strategy is constructed, together with an adaptive multi-objective reward function; the agent learns an optimal scheduling policy through real-time interaction with the environment, and a backpropagation-based policy-gradient learning algorithm accelerates the convergence of the decision model. Extensive simulation experiments on the Python platform using the real-world Bitbrains dataset show that the DR-CC method can effectively reduce energy consumption while maintaining service quality: compared with the baseline method, DR-CC reduces energy consumption by 12.7% and response time by 7.5%.
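The live-migration step in point (1) can be sketched as a checkpoint/transfer/restore pipeline. The thesis does not spell out its tooling beyond CRIU, so the sketch below assumes Docker's experimental `docker checkpoint` front end to CRIU; the host names, checkpoint names, and directory are hypothetical.

```python
import subprocess

def build_migration_commands(container, checkpoint, ckpt_dir, dest_host):
    """Build the commands for a CRIU-based live migration of a container.

    Assumes Docker's experimental `docker checkpoint` feature (which wraps
    CRIU) on both hosts; all names and paths are illustrative.
    """
    return [
        # 1. Freeze the container and dump its full state with CRIU.
        ["docker", "checkpoint", "create",
         f"--checkpoint-dir={ckpt_dir}", container, checkpoint],
        # 2. Copy the checkpoint image to the destination fog node.
        ["rsync", "-a", f"{ckpt_dir}/{checkpoint}", f"{dest_host}:{ckpt_dir}/"],
        # 3. Restore the container from the checkpoint on the destination.
        ["ssh", dest_host, "docker", "start",
         f"--checkpoint-dir={ckpt_dir}", f"--checkpoint={checkpoint}",
         container],
    ]

def migrate(container, checkpoint, ckpt_dir, dest_host, dry_run=True):
    """Run (or, by default, just print) the migration pipeline."""
    cmds = build_migration_commands(container, checkpoint, ckpt_dir, dest_host)
    for cmd in cmds:
        if dry_run:
            print(" ".join(cmd))
        else:
            subprocess.run(cmd, check=True)
    return cmds
```

In a consolidation loop, the scheduler would call `migrate` for each container chosen to move off an under-utilized host, then power that host down.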
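The models in point (2) are not given in closed form in this abstract, but they can be sketched with two standard building blocks: a linear host power model (idle power plus a utilization-proportional term) for energy, and an M/M/1 mean response time for delay. The coefficients below are illustrative, not the thesis's calibrated values.

```python
def host_power(util, p_idle=100.0, p_max=200.0):
    """Linear host power model (watts): P(u) = P_idle + (P_max - P_idle) * u,
    for CPU utilization u in [0, 1]. Coefficients are illustrative."""
    return p_idle + (p_max - p_idle) * util

def energy(util_trace, dt=1.0):
    """Energy (joules) consumed over a utilization trace sampled every
    dt seconds, integrating the power model step by step."""
    return sum(host_power(u) * dt for u in util_trace)

def mm1_response_time(arrival_rate, service_rate):
    """Mean response time of an M/M/1 queue: W = 1 / (mu - lambda),
    a common stand-in for a task's processing delay on a fog node."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: lambda >= mu")
    return 1.0 / (service_rate - arrival_rate)
```

Consolidating containers raises `util` on the remaining hosts (higher power per host but fewer hosts powered on) while `mm1_response_time` caps how far utilization can be pushed before delay violates the latency requirement.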
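Point (3)'s adaptive multi-objective reward and the A2C advantage signal can be sketched as follows. The normalization by reference values (e.g. running averages of past energy and response time) and the weight `alpha` are illustrative choices; the thesis's exact reward shaping is not reproduced here.

```python
def multi_objective_reward(energy_j, resp_time_s, energy_ref, resp_ref,
                           alpha=0.5):
    """Weighted multi-objective reward: normalize each cost by a reference
    value (e.g. a running average), combine with weight alpha, and negate
    so that maximizing reward minimizes both energy and response time."""
    e_norm = energy_j / energy_ref
    t_norm = resp_time_s / resp_ref
    return -(alpha * e_norm + (1.0 - alpha) * t_norm)

def advantage(reward, v_s, v_next, gamma=0.99):
    """One-step advantage estimate used by A2C:
    A(s, a) = r + gamma * V(s') - V(s).
    The actor's policy-gradient step is scaled by this quantity, and the
    critic is regressed toward the bootstrapped target r + gamma * V(s')."""
    return reward + gamma * v_next - v_s
```

When both objectives equal their references the reward is exactly -1, giving the agent a scale-free baseline; a positive advantage then reinforces container-placement actions that beat that baseline.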
Keywords/Search Tags: Fog computing, Resource scheduling, Deep reinforcement learning, Container technology, Modeling and simulation