
Research On Cache Strategy And Resource Allocation Based On Mobile Edge Computing

Posted on: 2024-05-03
Degree: Master
Type: Thesis
Country: China
Candidate: N X Li
Full Text: PDF
GTID: 2568307058482394
Subject: Master of Electronic Information (Professional Degree)
Abstract/Summary:
With the development of emerging technologies such as mobile networks, 5G communication, and the Internet of Things, new mobile applications such as computer vision, virtual/augmented reality, autonomous driving, face recognition, cloud gaming, and unmanned aerial vehicles are increasingly popular with users, and demand for them keeps growing. However, mobile devices are limited in battery capacity, computing power, and storage, so it is difficult to run these demanding tasks locally, and cloud computing emerged in response. With cloud computing, a user can offload a device's computation-intensive application tasks to a cloud server and exploit its computing power. However, because the remote cloud server is usually physically far from the mobile user, the resulting transmission delay is unavoidable, so neither traditional local computing nor cloud computing can satisfy many delay-sensitive applications. Mobile edge computing was proposed to meet this challenge: edge servers are deployed at the network edge, close to the user, so the user can offload a device's tasks to them, which satisfies both the task computation requirements and, thanks to the reduced transmission delay, the requirements of delay-sensitive tasks.

Edge servers provide storage resources as well as computing resources, so an edge server can also act as a cache node that stores the content users request. In pure cloud computing, requested content can only be fetched from remote cloud servers, which causes long service delays and further congests the core network. Enabling caching on edge servers allows popular content to be delivered directly from the edge cache instead of from the remote cloud, greatly reducing the traffic load on the backhaul link and the congestion of the core network. Incorporating a caching strategy into mobile edge computing can therefore improve the performance of the whole system.

This thesis first studies the content cache placement strategy together with resolution and power control for mobile augmented reality devices in a mobile edge computing scenario, and on this basis studies service cache placement and task offloading in mobile edge computing. The main work is as follows:

(1) In the mobile edge computing scenario, the limited computing resources of mobile augmented reality devices make augmented reality tasks challenging. Caching object detection information can reduce a device's power consumption and processing time, which makes it an effective way to handle such tasks. Because the storage resources of mobile augmented reality devices are insufficient, only a small amount of object detection information can be cached at any time, so the device must decide wisely which object detection information to cache in order to maximize edge computing performance. At the same time, how the resolution and transmit power of the device are set also has an important impact on how efficiently augmented reality tasks are processed. This thesis therefore studies the joint optimization of the cache decision, resolution, and power control of mobile augmented reality devices, formulating the joint problem as minimizing delay and energy consumption. To solve it, a cache decision, resolution, and power adjustment strategy for the edge computing system is proposed, together with an improved deep Q-network algorithm that learns this strategy. Simulation results show that the algorithm outperforms traditional cache decision algorithms.
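The abstract does not detail the improved deep Q-network, so the following is only a minimal sketch, assuming a vanilla DQN whose single discrete action jointly encodes the cache choice, resolution level, and power level. The problem sizes, state features, reward weights, and toy environment are placeholder assumptions, not the thesis's model.

```python
import random
from collections import deque

import numpy as np
import torch
import torch.nn as nn

# Hypothetical problem sizes; the abstract does not state these.
N_CACHE_SLOTS = 4      # which object-detection entry to cache
N_RESOLUTIONS = 3      # low / medium / high frame resolution
N_POWER_LEVELS = 3     # discrete transmit power levels
N_ACTIONS = N_CACHE_SLOTS * N_RESOLUTIONS * N_POWER_LEVELS
STATE_DIM = 8          # e.g. request popularity, channel gain, battery level

class QNet(nn.Module):
    """Small fully connected Q-network over the joint action space."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, N_ACTIONS))

    def forward(self, x):
        return self.net(x)

def decode(action):
    """Map a flat action index back to (cache slot, resolution, power level)."""
    cache, rest = divmod(action, N_RESOLUTIONS * N_POWER_LEVELS)
    resolution, power = divmod(rest, N_POWER_LEVELS)
    return cache, resolution, power

def step(state, action):
    """Toy environment: reward penalizes delay plus weighted energy use."""
    _, resolution, power = decode(action)
    delay = 0.2 * (resolution + 1) + 0.1 * np.random.rand()
    energy = 0.1 * (power + 1) + 0.05 * np.random.rand()
    reward = -(delay + 0.5 * energy)
    next_state = np.random.rand(STATE_DIM).astype(np.float32)
    return next_state, reward

q_net, target_net = QNet(), QNet()
target_net.load_state_dict(q_net.state_dict())
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)
gamma, eps, batch_size = 0.95, 0.1, 64

state = np.random.rand(STATE_DIM).astype(np.float32)
for t in range(1000):
    # Epsilon-greedy action selection over the joint action space.
    if random.random() < eps:
        action = random.randrange(N_ACTIONS)
    else:
        with torch.no_grad():
            action = int(q_net(torch.from_numpy(state)).argmax())
    next_state, reward = step(state, action)
    replay.append((state, action, reward, next_state))
    state = next_state

    if len(replay) >= batch_size:
        batch = random.sample(replay, batch_size)
        s, a, r, s2 = zip(*batch)
        s = torch.from_numpy(np.stack(s))
        s2 = torch.from_numpy(np.stack(s2))
        a = torch.tensor(a, dtype=torch.int64)
        r = torch.tensor(r, dtype=torch.float32)
        q = q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)
        with torch.no_grad():
            target = r + gamma * target_net(s2).max(1).values
        loss = nn.functional.mse_loss(q, target)
        optimizer.zero_grad(); loss.backward(); optimizer.step()

    if t % 100 == 0:
        # Periodically sync the target network (standard DQN practice).
        target_net.load_state_dict(q_net.state_dict())
```

In the actual system the reward would be computed from the measured task delay and device energy consumption rather than the toy model used here.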
(2) In the mobile edge computing scenario, tasks are heterogeneous and each task requires a different application service, so caching application services and the related data on edge servers is challenging. This thesis therefore studies service cache placement and task offloading in an Internet of Things (IoT) network. Because IoT devices and edge servers have limited storage and can cache only a few services at a time, the problem of service cache placement and task offloading of IoT devices is formulated as minimizing the task service delay under the devices' long-term energy constraints, which is a mixed-integer nonlinear programming problem. To solve it, an online deep reinforcement learning algorithm based on the Lyapunov optimization framework is proposed. A virtual queue model is first established, and Lyapunov optimization is used to decouple the long-term problem into a sequence of single-slot optimization problems; deep reinforcement learning is then used to find the optimal edge service caching and task offloading strategy in each slot. Simulation results show that the algorithm outperforms the benchmark algorithms and significantly reduces the service delay.
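As a rough illustration of the Lyapunov part of this approach, the sketch below maintains a virtual energy-deficit queue per IoT device and scores per-slot decisions with a standard drift-plus-penalty objective. The budget, the trade-off weight V, and the candidate decisions are assumptions; the per-slot problem is solved here by simple enumeration, whereas the thesis uses deep reinforcement learning for that step.

```python
import numpy as np

# Hypothetical sizes and weights; the abstract gives no numeric values.
N_DEVICES = 5
ENERGY_BUDGET = 0.5   # assumed per-slot average energy budget per device
V = 10.0              # trade-off weight between delay and queue stability

queues = np.zeros(N_DEVICES)   # virtual energy-deficit queues Q_i(t)

def per_slot_decision(candidates):
    """Drift-plus-penalty: pick the candidate minimizing V*delay + sum_i Q_i*e_i."""
    best, best_score = None, np.inf
    for delay, energy in candidates:      # energy: per-device energy costs
        score = V * delay + float(np.dot(queues, energy))
        if score < best_score:
            best, best_score = (delay, energy), score
    return best

for t in range(100):
    # Placeholder candidate caching/offloading decisions for this slot,
    # each summarized by (total task-service delay, per-device energy use).
    candidates = [(np.random.rand(), 0.8 * np.random.rand(N_DEVICES))
                  for _ in range(8)]
    delay, energy = per_slot_decision(candidates)
    # Virtual queue update: Q_i(t+1) = max(Q_i(t) + e_i(t) - budget, 0);
    # a growing Q_i signals that device i is exceeding its energy budget.
    queues = np.maximum(queues + energy - ENERGY_BUDGET, 0.0)
```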
Keywords/Search Tags:mobile edge computing, service caching, task offloading, Lyapunov optimization