Recently, entertainment demands on the Internet of Vehicles (IoV) have grown explosively, and various vehicle-based applications have been developed. In public transportation systems, entertainment formats such as video and games are gaining popularity among passengers and have become a major contributor to the growth of communication traffic. However, the cloud server has limited backhaul link capacity, which leads to increased delay in obtaining video content. Introducing edge caching technology into the IoV is considered one of the potential solutions, as it can effectively guarantee the quality of experience (QoE) in public transportation systems. Nevertheless, edge caching also faces challenges. For example, due to the high mobility of passenger terminals in public vehicles, it is hard for fixed Road Side Units (RSUs) or other mobile vehicles to establish stable communication connections with them. Therefore, some researchers propose equipping public vehicles with a Mobile Public Server (MPS) to provide content services for passengers. The MPS is physically close to passengers in the public vehicle, which reduces content transmission delay. However, because passengers in public vehicles generate numerous content demands, two important problems arise: how to cache content with limited storage resources, and how to distribute content with limited communication resources. This thesis therefore concentrates on improving resource efficiency in the public transportation system, and its main contributions are as follows:

To tackle the problem of limited storage resources in public vehicles, a QoE-based hierarchical video caching strategy is proposed. To cache more kinds of videos in the MPS, the MPS and the RSUs are designed to transmit content cooperatively: the MPS provides the beginning segments of a video, and the RSUs provide the remaining segments for passengers. To evaluate the user experience more accurately, the QoE hit rate is defined as the probability that the bus and the RSUs jointly succeed in providing passengers with the desired video segments. Furthermore, since changes in passenger flow lead to different video preferences, a deep reinforcement learning algorithm is adopted to form a caching update strategy that adjusts the proportion of video segments cached by the MPS, so that more passengers obtain the required video at the beginning, thereby maximizing the QoE hit rate. Simulation results demonstrate that this strategy outperforms the comparison methods in both QoE hit rate and caching cost.

To tackle the problem of insufficient communication resources in public vehicles, an adaptive content delivery strategy based on two-layer collaboration is proposed. Specifically, the MPS layer and a device-to-device (D2D) layer composed of passenger terminals jointly provide content delivery services for passengers in the public vehicle. In the MPS layer, considering that the MPS may receive numerous content requests within a short time, a deep reinforcement learning network is trained to decide the processing priority of each request according to its delivery deadline, content size, and payment, with the goal of maximizing MPS revenue and speeding up request processing. In addition, since the MPS has only limited communication resources, a reward model is established in the D2D layer to encourage passengers to participate in content sharing, and the Kuhn-Munkres algorithm is adopted to generate transmission pairs between passenger terminals with the goal of maximizing the revenue of content providers. Simulation results illustrate that this strategy achieves shorter content delivery delay and a higher content distribution rate compared with the first-come-first-served mechanism.
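The D2D pairing step described above is an assignment problem: each sender terminal is matched to at most one receiver so that total provider revenue is maximized, which is exactly what the Kuhn-Munkres (Hungarian) algorithm computes. As a minimal sketch under assumed inputs (the revenue matrix below is hypothetical, not from the thesis), a brute-force search over permutations finds the same optimum that Kuhn-Munkres would find in O(n^3) time instead of O(n!):

```python
from itertools import permutations

# Hypothetical revenue matrix: revenue[i][j] is the content provider's
# revenue when sender terminal i transmits to receiver terminal j over D2D.
revenue = [
    [4.0, 1.0, 3.0],
    [2.0, 0.0, 5.0],
    [3.0, 2.0, 2.0],
]

def best_matching(revenue):
    """Exhaustive search for the maximum-revenue one-to-one matching.

    Kuhn-Munkres returns the same optimal assignment; this O(n!) search
    is only for illustrating the objective on a tiny instance.
    """
    n = len(revenue)
    best_pairs, best_total = None, float("-inf")
    for perm in permutations(range(n)):
        total = sum(revenue[i][perm[i]] for i in range(n))
        if total > best_total:
            best_total, best_pairs = total, list(enumerate(perm))
    return best_pairs, best_total

pairs, total = best_matching(revenue)
# pairs == [(0, 0), (1, 2), (2, 1)], total == 11.0
```

Greedily pairing each sender with its best receiver would pick (1, 2) first and could then force sender 0 onto a poor receiver; the one-to-one matching constraint is why a global assignment algorithm is needed here.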