In IoV (Internet of Vehicles) systems, vehicles have limited computing capacity for complex computing tasks, so the most direct approach is to transfer the data to a cloud data center for processing. However, cloud data centers are far from the vehicles, which can lead to network congestion, long delays, and data privacy leaks. To address these problems, the edge computing model has developed rapidly, and offloading tasks to edge servers close to the vehicles is a practical approach. In addition, a vehicle requests multiple tasks while moving, and these tasks must collaborate with each other to complete a complex business goal. For example, when a vehicle is looking for a parking lot, it needs to access both the remaining parking-space information of the nearest lot and the corresponding road condition information. The association between subtasks when tasks are offloaded is therefore an important issue that cannot be ignored. This work focuses on the relationships between tasks and studies the task offloading problem in the Internet of Vehicles environment, making the following contributions:

1) We propose a multi-task collaborative offloading method. The method first mines the interacting task pairs with a frequent-pattern-based mining algorithm, and then builds a task offloading delay model that considers not only the request delay in three cases, namely local computing, full offloading, and partial offloading, but also the data communication delay between interacting subtasks. Finally, a task collaborative offloading algorithm based on deep reinforcement learning offloads the interacting tasks collaboratively (the pair-mining step and the delay model are sketched after this summary).

2) In the dynamic and complex vehicular network environment, multiple vehicles offloading tasks to the same edge server compete for resources, which may greatly increase the service delay, and the positions of the vehicles change over time. To solve this problem, we design an offloading algorithm named CSO-DRL to search for the optimal offloading strategy. We construct an offloading delay model for the resource-competition environment that accounts for the vehicle movement trajectories and the edge servers' computing resources, train the model with a deep reinforcement learning algorithm to obtain the optimal offloading strategy, and verify the efficiency of the algorithm through comparative experiments (a minimal sketch of such an agent is given below).

3) In view of the imbalance of computation resources across edge servers, we consider the collaboration of multiple edge servers for task offloading. When multiple vehicles compete for computation resources, we schedule tasks among the edge servers to balance the computation load (a simple scheduling sketch is given below). On this basis, we construct an offloading decision model and conduct simulation experiments on a real-world dataset. The optimal offloading strategy is obtained by training the deep reinforcement learning algorithm, and the experimental results show that the algorithm reduces the offloading delay effectively, so that the edge layer can provide low-delay services for users while reducing resource consumption and the pressure on remote cloud centers.
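To make the pair-mining idea in contribution 1 concrete, the following is a minimal sketch, not the paper's exact algorithm: it counts task pairs that co-occur in historical request traces and keeps the pairs whose support reaches a threshold. The trace format and the `min_support` parameter are illustrative assumptions.

```python
from collections import Counter
from itertools import combinations

def mine_interacting_pairs(request_traces, min_support=0.2):
    """Return task pairs that co-occur in at least `min_support`
    fraction of the historical request traces (a simplified,
    frequent-pattern style pair miner)."""
    pair_counts = Counter()
    for trace in request_traces:
        # Count each unordered pair of distinct tasks once per trace.
        for pair in combinations(sorted(set(trace)), 2):
            pair_counts[pair] += 1
    n = len(request_traces)
    return {pair: cnt / n for pair, cnt in pair_counts.items()
            if cnt / n >= min_support}

# Example: traces of task IDs requested together by vehicles.
traces = [["parking_query", "road_condition", "navigation"],
          ["parking_query", "road_condition"],
          ["navigation", "media_stream"]]
print(mine_interacting_pairs(traces, min_support=0.5))
```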
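The delay model in contribution 1 distinguishes local computing, full offloading, and partial offloading, and adds a communication term between interacting subtasks placed on different nodes. A hedged sketch of these four terms follows; all parameter names (CPU frequencies, data sizes, link rates, the `split` ratio) are assumptions chosen for illustration rather than the paper's notation.

```python
def local_delay(cycles, vehicle_freq):
    # Time to execute the whole task on the vehicle's own CPU.
    return cycles / vehicle_freq

def full_offload_delay(cycles, data_size, uplink_rate, edge_freq):
    # Upload the input data, then execute on the edge server.
    return data_size / uplink_rate + cycles / edge_freq

def partial_offload_delay(cycles, data_size, split, vehicle_freq,
                          uplink_rate, edge_freq):
    # A fraction `split` of the workload is offloaded; the local and
    # remote parts run in parallel, so the delay is their maximum.
    local = (1 - split) * cycles / vehicle_freq
    remote = split * data_size / uplink_rate + split * cycles / edge_freq
    return max(local, remote)

def interaction_delay(exchanged_bytes, link_rate):
    # Extra delay for data exchanged between two interacting subtasks
    # that end up on different nodes.
    return exchanged_bytes / link_rate
```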
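Contribution 2 trains a deep reinforcement learning agent (CSO-DRL) to choose offloading targets under resource competition and vehicle mobility. The sketch below shows only the generic DQN-style machinery such a method could build on, under the assumption that the state encodes vehicle position and per-server load and that each action selects an offloading target; the paper's actual state, action, and reward design may differ.

```python
import random
import torch
import torch.nn as nn

class QNet(nn.Module):
    """Maps an offloading state (e.g. vehicle position, task size,
    per-server load) to a Q-value per action (compute locally or
    offload to one of the candidate edge servers)."""
    def __init__(self, state_dim, n_actions):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, n_actions))

    def forward(self, x):
        return self.net(x)

def select_action(qnet, state, epsilon, n_actions):
    # Epsilon-greedy exploration over offloading targets.
    if random.random() < epsilon:
        return random.randrange(n_actions)
    with torch.no_grad():
        return int(qnet(state.unsqueeze(0)).argmax(dim=1))

def td_update(qnet, target_net, optimizer, batch, gamma=0.99):
    # One temporal-difference step; a natural reward here is the
    # negative offloading delay from the delay model above, so that
    # minimizing delay and maximizing return coincide.
    states, actions, rewards, next_states = batch
    q = qnet(states).gather(1, actions.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        target = rewards + gamma * target_net(next_states).max(dim=1).values
    loss = nn.functional.mse_loss(q, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return float(loss)
```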
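Contribution 3 balances load by scheduling tasks among cooperating edge servers. As a simple illustrative baseline rather than the paper's DRL decision model, the greedy scheduler below assigns each task to the server with the smallest estimated completion time, counting both the transfer delay and the queueing caused by the server's current load; the server and task fields are assumptions for illustration.

```python
def estimated_finish_time(task, server):
    # Transfer the input data, wait behind the queued work,
    # then execute on the server's CPU.
    transfer = task["data_size"] / server["link_rate"]
    queueing = server["queued_cycles"] / server["cpu_freq"]
    compute = task["cycles"] / server["cpu_freq"]
    return transfer + queueing + compute

def greedy_balance(tasks, servers):
    """Assign each task to the edge server that finishes it earliest,
    updating that server's queue so later tasks see the new load."""
    assignment = {}
    for task in tasks:
        best = min(servers, key=lambda s: estimated_finish_time(task, s))
        assignment[task["id"]] = best["id"]
        best["queued_cycles"] += task["cycles"]
    return assignment

servers = [{"id": "edge-1", "cpu_freq": 10e9, "link_rate": 50e6, "queued_cycles": 4e9},
           {"id": "edge-2", "cpu_freq": 8e9, "link_rate": 80e6, "queued_cycles": 0.0}]
tasks = [{"id": "t1", "cycles": 2e9, "data_size": 5e6},
         {"id": "t2", "cycles": 1e9, "data_size": 1e6}]
print(greedy_balance(tasks, servers))
```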