With the development of 5G technology and artificial intelligence, emerging applications in the Internet of Things (IoT) generate a large number of computing tasks that demand high quality and low delay, far beyond the capacity of existing terminal devices and cloud computing. As an edge computing architecture that addresses this shortage of computing power, fog computing (FC) has attracted growing attention: introducing fog computing into the IoT can effectively reduce task computing delay and ease the computing pressure on terminals. However, how edge computing resources are allocated among the massive number of IoT terminals directly determines the quality of service (QoS) of tasks, so reinforcement learning (RL) can be introduced to train the offloading strategy and optimize resource allocation. At the same time, the massive data generated in the IoT heavily occupies limited communication and storage resources during transmission and storage, which also degrades service quality. As a fast data compression technique, compressed sensing (CS) can process the massive data in the IoT, further improving system space utilization, shortening task transmission delay, and improving service quality. In the IoT scenario, this paper studies how to effectively improve the computing power of terminal devices, provide users with high-quality services, and handle the transmission and storage of massive data in the network. The main innovations and work are as follows:

(1) For the scenario of large-scale computing tasks from massive IoT devices, a computation offloading model is proposed. Targeting computation-intensive applications, a distributed task offloading model based on fog computing is designed: when a computing task is offloaded to the primary fog node, it is split and assigned to fog nodes in the same region for auxiliary computing. Exploiting the fog layer's short transmission delay and high aggregate computing power improves the QoS of the task and reduces its delay (a delay-model sketch follows this paragraph). Simulation analysis shows that the proposed distributed task offloading model can solve the problem of insufficient computing power at the terminal.
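As a rough illustration of the distributed offloading model in (1), the sketch below computes the completion delay when a task is split among helper fog nodes in proportion to their CPU frequency. This is a minimal sketch under assumed parameters (task size, link rates, CPU frequencies); it is not the thesis's actual delay model.

```python
def offload_delay(task_bits, task_cycles, link_rates, cpu_freqs):
    """Split a task among helper fog nodes proportionally to CPU frequency
    and return the completion delay (the slowest slice dominates).

    task_bits   -- task size in bits (assumed parameter)
    task_cycles -- total CPU cycles required (assumed parameter)
    link_rates  -- uplink rate (bit/s) from the primary node to each helper
    cpu_freqs   -- CPU frequency (cycles/s) of each helper node
    """
    total_freq = sum(cpu_freqs)
    delays = []
    for rate, freq in zip(link_rates, cpu_freqs):
        share = freq / total_freq          # fraction of the task assigned
        tx = share * task_bits / rate      # transmission delay of the slice
        comp = share * task_cycles / freq  # computation delay of the slice
        delays.append(tx + comp)
    return max(delays)                     # slices run in parallel

# Example with assumed values: a 2 Mbit task needing 1e9 cycles, three helpers.
print(offload_delay(2e6, 1e9, [1e7, 2e7, 1.5e7], [2e9, 3e9, 2.5e9]))
```

Splitting in proportion to CPU frequency equalizes the computation delay across helpers, so under this assumed model the completion time is dominated by the slice with the slowest link.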
(2) For the same scenario of large-scale computing tasks from massive IoT devices, a computation offloading strategy is proposed. Building on the offloading model above, an optimization strategy for IoT resource allocation using machine learning is designed: the offloading problem is formulated as a Markov decision process (MDP), and reinforcement learning, with its ability to solve sequential decision problems, is used to optimize resource allocation, reduce system offloading delay, and improve service quality (see the Q-learning sketch following this summary). Simulation results show that the proposed algorithm converges well and significantly reduces task delay.

(3) For the scenario of massive data transmission and storage in the IoT, a data processing model is proposed. A data processing model based on compressed sensing is designed to address the excessive occupation of communication and storage resources by terminal data collection and massive data. Time-domain compression and sub-sampling preprocessing are performed on the terminal device, so that shrinking the data improves the utilization of communication resources and reduces transmission delay. After the compressed data is offloaded to a fog node, it is either stored or computed on: storing data in compressed form improves the space utilization of the server, while tasks that must be computed immediately first reconstruct the data, and the recovery properties of compressed sensing allow this to be done quickly, reducing the processing delay of the whole task. Simulation analysis shows that the proposed data processing model based on compressed sensing can alleviate the shortage of communication and storage resources in massive-data scenarios.
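As a concrete, minimal illustration of the MDP formulation in (2), the sketch below runs tabular Q-learning on a toy offloading environment. The state space (load levels), action space (candidate fog nodes), and reward (negative task delay) are illustrative assumptions; the thesis's actual state, action, and reward definitions may differ.

```python
import random

N_STATES, N_ACTIONS = 5, 3         # load levels x candidate fog nodes (assumed)
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1  # learning rate, discount, exploration rate
Q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]

def step(state, action):
    """Hypothetical environment: delay grows with system load and with the
    chosen node; the load level drifts randomly. Returns (next_state, reward)."""
    delay = 1.0 + 0.5 * state + random.random() * (action + 1)
    next_state = min(N_STATES - 1, max(0, state + random.choice((-1, 1))))
    return next_state, -delay     # reward = negative task delay

state = 0
for episode in range(5000):
    # epsilon-greedy action selection
    if random.random() < EPS:
        action = random.randrange(N_ACTIONS)
    else:
        action = max(range(N_ACTIONS), key=lambda a: Q[state][a])
    next_state, reward = step(state, action)
    # Q-learning temporal-difference update
    Q[state][action] += ALPHA * (reward + GAMMA * max(Q[next_state])
                                 - Q[state][action])
    state = next_state
```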
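For the compressed-sensing pipeline in (3), the sketch below sub-samples a sparse signal with a random measurement matrix and reconstructs it with orthogonal matching pursuit (OMP), one standard CS recovery algorithm. The dimensions, sparsity level, and the choice of OMP are illustrative assumptions, not the thesis's specific design.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 64, 5                    # signal length, measurements, sparsity (assumed)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)  # k-sparse signal
Phi = rng.standard_normal((m, n)) / np.sqrt(m)               # measurement matrix
y = Phi @ x                             # compressed measurements (m << n)

def omp(Phi, y, k):
    """Greedy OMP recovery: pick the column most correlated with the
    residual, then re-fit by least squares over the chosen support."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(Phi, y, k)
print("relative recovery error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```

Because m is much smaller than n, the fog node can store and transmit only y, and reconstruction is run only when a task actually needs the raw data, which matches the stored-or-computed split described in (3).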