
Research On Fog Computing Based Fair And Energy-saving Computation Offloading And Content Service

Posted on: 2022-07-01    Degree: Master    Type: Thesis
Country: China    Candidate: Z H You    Full Text: PDF
GTID: 2518306557964089    Subject: Logistics Engineering
Abstract/Summary:
With the booming development of 5G technology, fog computing has become a promising computing paradigm that provides powerful storage and computing capabilities to users at the edge of the network. It can effectively reduce the computing burden on Internet of Things (IoT) devices and thus better meet the needs of delay-sensitive and computation-intensive applications. To address the bottleneck of limited resources in IoT devices more effectively, computation offloading has gradually become a hot research topic in the fog computing area. However, current research on computation offloading and content service for fog-assisted IoT is still immature, with several drawbacks in security, fairness, and energy efficiency. To solve these problems, this thesis studies fog computing based fair and energy-saving computation offloading and content service schemes with security guarantees. The main contributions include the following three aspects:

1) Privacy and Energy Co-aware Data Aggregation Computation Offloading for Fog-assisted IoT Networks: To address the bottleneck of limited resources in IoT devices more efficiently and to provide security guarantees during data processing and forwarding, this thesis proposes a privacy and energy co-aware data aggregation computation offloading mechanism for fog-assisted IoT networks. Specifically, a fog-assisted three-layer secure computing architecture is developed to counteract security threats and to enable the aggregation operation to be performed on ciphertext. Meanwhile, a momentum gradient descent based energy-efficient offloading decision algorithm is developed to minimize the total energy consumption of computation tasks, achieving the optimal value with a fast convergence rate. Finally, the security and performance evaluations show that the proposed data aggregation offloading mechanism processes data securely and achieves a significant advantage in energy consumption; for example, the total energy consumption is reduced by an average of 23.1% compared with the benchmark performance guaranteed computation offloading (PGCO) solution.

2) Fairness and Energy Co-aware Computation Offloading for Fog-assisted IoT: To construct a green, long-lifetime IoT, this thesis proposes a fairness and energy co-aware computation offloading mechanism for fog-assisted IoT. Specifically, by jointly considering each fog node's computing capacity and bandwidth resources together with an offloading decision that accounts for energy consumption fairness, an optimization problem is formulated to minimize the total energy consumption of all computation tasks. A Momentum Gradient and Coordinate Collaboration Descent based Fair Energy Minimization Algorithm (MGCCD-FEM) is proposed to solve this mixed integer nonlinear programming problem. First, based on the historical average energy consumption, distance, computing capacity, and residual energy of each fog node, a fairness index is designed to obtain the offloading decision with the optimal energy consumption fairness. Then, given the obtained optimal offloading decision, the total energy consumption for processing all tasks is minimized by jointly optimizing the occupation ratios of computing and bandwidth resources with the developed momentum gradient and coordinate collaboration descent method. Finally, the simulation results show that the proposed mechanism converges faster than the baselines; compared with the other two benchmark schemes, it achieves the lowest total energy consumption and the highest energy consumption fairness among fog nodes, and it extends the network lifetime by 23.6% and 31.2% on average, respectively. A minimal sketch of the fairness-index-based node selection and the momentum gradient descent step is given below.
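The abstract does not give the exact form of the fairness index or of the energy model, so the following Python sketch only illustrates the general idea behind contributions 1) and 2) under assumed definitions: a hypothetical fairness score combining historical average energy consumption, distance, computing capacity, and residual energy, followed by plain momentum gradient descent on a toy energy function of the computing and bandwidth occupation ratios. The weighting, the energy function, and the constraint handling are illustrative assumptions, not the thesis's MGCCD-FEM algorithm.

```python
import numpy as np

def fairness_index(hist_energy, distance, capacity, residual_energy):
    """Hypothetical fairness score for a fog node: lower past energy use,
    shorter distance, larger capacity and more residual energy -> higher score.
    The weighting below is an illustrative assumption."""
    return (capacity * residual_energy) / ((1.0 + hist_energy) * (1.0 + distance))

def select_fog_node(nodes):
    """Pick the fog node with the highest fairness score."""
    scores = [fairness_index(n["hist_energy"], n["distance"],
                             n["capacity"], n["residual_energy"]) for n in nodes]
    return int(np.argmax(scores))

def toy_energy(ratios):
    """Toy stand-in for the task energy model E(computing_ratio, bandwidth_ratio)."""
    c, b = ratios
    return 1.0 / (0.1 + c) + 1.0 / (0.1 + b) + 2.0 * (c + b)

def toy_energy_grad(ratios):
    """Gradient of the toy energy model above."""
    c, b = ratios
    return np.array([-1.0 / (0.1 + c) ** 2 + 2.0, -1.0 / (0.1 + b) ** 2 + 2.0])

def momentum_gradient_descent(grad_fn, x0, lr=0.01, beta=0.9, steps=500):
    """Standard momentum gradient descent, with the ratios clipped to (0, 1]."""
    x = np.array(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        v = beta * v + (1.0 - beta) * grad_fn(x)
        x = np.clip(x - lr * v, 1e-3, 1.0)
    return x

nodes = [
    {"hist_energy": 3.2, "distance": 10.0, "capacity": 2.0, "residual_energy": 0.8},
    {"hist_energy": 1.1, "distance": 25.0, "capacity": 1.5, "residual_energy": 0.9},
]
best = select_fog_node(nodes)
ratios = momentum_gradient_descent(toy_energy_grad, [0.5, 0.5])
print(best, ratios, toy_energy(ratios))
```

In this toy setting the momentum term smooths the updates on the occupation ratios, which is the same mechanism the thesis credits for the fast convergence of its offloading decision algorithms.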
3) Cache Management and Resource Allocation based Fair and Energy-saving Content Service Mechanism: To construct an efficient, long-lifetime IoT for content service, this thesis proposes a cache management and resource allocation based fair and energy-saving content service mechanism. Specifically, by jointly optimizing the fog node decision, the cloud server service decision, and the bandwidth occupation ratios of the fog node and the cloud server, an optimization problem is formulated to minimize the total energy consumption of the content service process. A Least Recently Used-2 (LRU-2) and Nesterov momentum based fair energy minimization algorithm is developed to solve this mixed integer nonlinear programming problem. First, a fairness metric is designed based on the historical average energy consumption, the residual energy, and the distance between the fog node and the IoT device. According to this metric and the caching state of the fog layer, the fog node decision with the optimal energy consumption fairness and the cloud server service decision are obtained. When a request misses in the fog layer, the fog node caches the content fed back from the cloud according to the caching strategy developed in this thesis. Then, based on the optimal fog node decision and cloud server service decision, Nesterov momentum is employed to jointly optimize the bandwidth ratios of the fog node and the cloud server, minimizing the total energy consumption of the content service process. Finally, the simulation results show that the proposed mechanism achieves a fast convergence speed and a high hit rate. Compared with three other benchmark solutions, it achieves the lowest total energy consumption and the highest energy fairness among fog nodes, and it improves the average network lifetime by 23%, 28.7%, and 34%, respectively. A minimal LRU-2 cache sketch is given below.
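The thesis's exact caching strategy is not detailed in the abstract; the sketch below is a standard LRU-2 policy (from the LRU-K family), which evicts the cached item whose second-most-recent access is oldest and treats items with fewer than two recorded accesses as the least valuable. Class and method names are illustrative, not taken from the thesis.

```python
from collections import defaultdict
from math import inf

class LRU2Cache:
    """Minimal LRU-2 cache: evicts the item whose second-most-recent
    access time is oldest; items accessed only once are evicted first."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}                         # content_id -> content
        self.access_times = defaultdict(list)   # content_id -> recent access times
        self.clock = 0

    def _touch(self, content_id):
        self.clock += 1
        times = self.access_times[content_id]
        times.append(self.clock)
        if len(times) > 2:                      # keep only the last two accesses
            del times[:-2]

    def get(self, content_id):
        """Return cached content, or None on a miss (a miss would then be
        served by the cloud and inserted via put())."""
        if content_id in self.store:
            self._touch(content_id)
            return self.store[content_id]
        return None

    def put(self, content_id, content):
        if content_id not in self.store and len(self.store) >= self.capacity:
            # Evict the entry with the oldest (or missing) second-most-recent access.
            def penultimate(cid):
                times = self.access_times[cid]
                return times[-2] if len(times) >= 2 else -inf
            victim = min(self.store, key=penultimate)
            del self.store[victim]
            del self.access_times[victim]
        self.store[content_id] = content
        self._touch(content_id)

cache = LRU2Cache(capacity=2)
cache.put("video_a", b"...")
cache.put("video_b", b"...")
cache.get("video_a")
cache.get("video_a")           # video_a now has two recorded accesses
cache.put("video_c", b"...")   # evicts video_b (only one recorded access)
print(sorted(cache.store))     # ['video_a', 'video_c']
```

Compared with plain LRU, keeping the last two access times filters out one-off requests, which is one reason an LRU-2 style policy can raise the fog-layer hit rate that the abstract reports.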
Keywords/Search Tags:fog computing, computation offloading, content service, homomorphic encryption, fairness index, cache management