
Research On Edge Caching Strategy Based On User Long And Short Term Interest

Posted on: 2024-04-04
Degree: Master
Type: Thesis
Country: China
Candidate: M Y Si
Full Text: PDF
GTID: 2568307079966369
Subject: Electronic information
Abstract/Summary:
With the development of mobile communication technology, rapidly growing mobile data traffic places a heavy burden on base stations and the core network, and the traditional network architecture can hardly meet users' demand for low-latency access. Edge caching is regarded as an effective solution: by deploying cache servers near users, it avoids network congestion and reduces the average content delivery latency. Since resources at the network edge are limited, research on edge caching focuses on how to use these limited resources to serve users better. This thesis studies the edge caching problem from two perspectives: cache content selection and cache content placement. The main work is as follows:

(1) For the content selection problem, this thesis proposes a Long and Short Term Attention Network (LSTAN) recommendation model from the perspective of content recommendation. The model characterizes user interests from both long-term and short-term views and more accurately predicts the content that users are most likely to request in the future. A user's long-term interest is obtained by weighting the user's historical behavior sequence with an attention network. For short-term interest, the sequential information in recent behavior is extracted by a Gated Recurrent Unit (GRU) fused with attention, and a multi-head self-attention network captures multiple points of interest of the user simultaneously. Finally, the long-term interest, the short-term interest, and the user features are fed into a fully connected network for training to obtain a fused long- and short-term interest vector. Comparative simulations on the MovieLens and Amazon datasets show that the proposed LSTAN model improves recommendation performance by 14.9% over the baseline model, which in turn improves the accuracy of cached content selection.

(2) For the content placement problem in the edge server collaboration scenario, this thesis proposes an adaptive-redundancy caching strategy based on a greedy algorithm, starting from the cache redundancy. The strategy adaptively adjusts cache redundancy by predicting the caching gain of each content item and placing it on one or more edge servers accordingly. The cache space of each edge server is divided into two parts, and on cache replacement the placement is decided according to the distribution of content requests. In addition, this thesis analyzes the limitations of cache hit rate as a metric for edge caching strategies and validates the proposed strategy in comparative simulations using the average content response latency as the evaluation metric. The results show that the proposed strategy can adaptively adjust content redundancy within the cache collaboration domain according to the available cache resources, reducing average response latency by 38% compared with a region-popularity-based caching algorithm.
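The long/short-term interest fusion described in (1) can be illustrated with a toy NumPy sketch. This is not the thesis's trained LSTAN model: all weights below are random and untrained, the multi-head self-attention component is omitted, and the dimensions are arbitrary. It only shows the data flow — attention pooling over the long history, a GRU pass over the recent history, then concatenation into a fusion layer.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(history, query):
    # history: (T, d) item embeddings; query: (d,) user embedding.
    # Attention weights over the history, then a weighted sum -> long-term interest.
    weights = softmax(history @ query)        # (T,)
    return weights @ history                  # (d,)

def gru_step(x, h, W, U, b):
    # One GRU cell update; rows of W/U/b are packed as [reset; update; candidate].
    d = h.shape[0]
    zr = 1 / (1 + np.exp(-(W[:2*d] @ x + U[:2*d] @ h + b[:2*d])))
    r, z = zr[:d], zr[d:]
    h_tilde = np.tanh(W[2*d:] @ x + U[2*d:] @ (r * h) + b[2*d:])
    return (1 - z) * h + z * h_tilde

d = 8
history_long = rng.normal(size=(20, d))   # long-term behavior sequence (toy data)
history_short = rng.normal(size=(5, d))   # recent behavior sequence (toy data)
user_emb = rng.normal(size=d)

# Long-term interest: attention-weighted sum of the long history.
long_interest = attention_pool(history_long, user_emb)

# Short-term interest: run recent items through a GRU, keep the last state.
W = rng.normal(size=(3*d, d))
U = rng.normal(size=(3*d, d))
b = np.zeros(3*d)
h = np.zeros(d)
for x in history_short:
    h = gru_step(x, h, W, U, b)
short_interest = h

# Fusion: concatenate both interests with user features, feed a dense layer.
fusion_in = np.concatenate([long_interest, short_interest, user_emb])  # (3d,)
W_fc = rng.normal(size=(d, 3*d))
fusion_vec = np.tanh(W_fc @ fusion_in)
print(fusion_vec.shape)  # (8,)
```

In the actual model the attention, GRU, and fully connected parameters would be learned jointly from the click/request history rather than drawn at random.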
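The placement step in (2) can be sketched as a greedy loop over predicted per-server caching gains. The gain values, content names, and server names below are invented for illustration, and the thesis's gain-prediction step and two-part cache partition are not reproduced — the sketch only shows how greedy selection lets redundancy adapt: content with high gain on several servers gets replicated, content with locally concentrated gain gets a single copy.

```python
import heapq

def greedy_placement(gains, capacities):
    """Greedy cache placement with adaptive redundancy.

    gains[c][s] is the predicted caching gain of placing content c on
    edge server s; capacities[s] is how many items server s can hold.
    A content item may end up on several servers, so its redundancy
    follows how much gain each replica is predicted to bring.
    """
    # Max-heap of candidate placements, highest predicted gain first.
    heap = [(-g, c, s) for c, per_server in gains.items()
            for s, g in per_server.items()]
    heapq.heapify(heap)
    placement = {s: [] for s in capacities}
    while heap:
        _, c, s = heapq.heappop(heap)
        if len(placement[s]) < capacities[s] and c not in placement[s]:
            placement[s].append(c)
    return placement

# Hypothetical gains: video_a is popular in both regions, b and c locally.
gains = {
    "video_a": {"edge1": 0.9, "edge2": 0.8},
    "video_b": {"edge1": 0.5, "edge2": 0.1},
    "video_c": {"edge1": 0.2, "edge2": 0.6},
}
capacities = {"edge1": 2, "edge2": 2}
print(greedy_placement(gains, capacities))
# {'edge1': ['video_a', 'video_b'], 'edge2': ['video_a', 'video_c']}
```

Note that video_a is replicated on both servers while video_b and video_c each get one copy — the redundancy adjusts to the gain distribution rather than being fixed in advance.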
Keywords/Search Tags: Edge Caching, Recommendation Model, Attention Mechanism, Cache Placement