
Key Techniques Of Distributed Cache At The Edge Network For Streaming Media Transmission

Posted on: 2013-10-25    Degree: Master    Type: Thesis
Country: China    Candidate: B Huang    Full Text: PDF
GTID: 2268330422474309    Subject: Computer Science and Technology
Abstract/Summary:
The Internet users' demand for streaming media services is growing rapidly. By the end of June 2012, the number of Internet users in China had reached 538 million, an increase of about 25 million. This poses a major challenge to the streaming media business. Large-scale streaming media applications place high demands on network transmission capacity, transmission quality assurance, and server capacity. However, existing network transmission often suffers from delay, jitter, packet loss, and other problems, so transmission quality cannot keep up with the rapid growth of streaming media applications, and poor streaming performance degrades the user experience. Content providers are therefore required to deploy more powerful streaming service equipment, and network service providers to offer a more stable and efficient transmission network. Limited network transmission capacity and server capacity constrain the large-scale deployment of streaming media. To address these problems, further research is needed on content distribution, servers, and compression coding. Solutions such as Peer-to-Peer (P2P) and Content Distribution Networks (CDN) have exposed shortcomings in practice. Among the approaches that relieve data transmission pressure on the core network and accelerate users' data acquisition, caching data at the edge of the network is a feasible, effective, and widely advocated measure.

With the performance improvement of network switching equipment, researchers have begun to move beyond the traditional end-to-end mindset and to explore transmission control and optimization using network interconnection devices. Named Data Networking (NDN) strongly supports this idea, and the emergence of programmable interconnection equipment such as NetMagic further confirms that such devices can play a role in content distribution. RAMCloud proposes using DRAM entirely as the data storage medium, opening a new approach to the data access latency problem. At the same time, the rapid development of cloud storage technology in recent years provides a valuable reference for caching data in a distributed manner. Key-value data stores such as Google's Bigtable, Amazon's Dynamo, and Taobao's Tair have been widely adopted for their excellent scalability, and their data models offer useful references and comparisons for distributed caching at the edge of the network.

Existing research on edge caching techniques still has shortcomings, such as cache server bottlenecks, the low service efficiency of a single cache server, and service systems designed only for particular networks or special applications. Our research therefore focuses on aggregating user memory resources at the network edge, expanding cache service capacity, accelerating data acquisition, and conducting in-depth research on distributed cache data management. The main work and innovations include:

(1) We propose using the free memory resources of end hosts at the network edge as cache resources to construct a distributed caching infrastructure service architecture named BufferBank, and we refine and describe the edge caching system with three mapping models (see the illustrative sketch after the summary below).

(2) Based on the three mapping models, we design a distributed data cache structure. This structure manages distributed cached data well, offers high scalability and a certain degree of reliability, and provides fast data query and positioning; a multi-level cache at the end host further speeds up client data acquisition and improves the user experience.

(3) Through experiments and statistical analysis of data read performance and bandwidth requirements under existing conditions, we show that the BufferBank cache architecture is feasible and that it provides lower access latency, greater bandwidth, and higher service efficiency than traditional disk-based approaches.

In summary, this thesis conducts in-depth research on distributed cache infrastructure at the edge of the network, proposes three mapping models, and designs distributed cache data management structures and algorithms based on these models. The work has important reference value for edge caching and distributed data management in large-scale streaming media distribution networks.
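The abstract does not spell out the three mapping models or BufferBank's internal interfaces, so the following is only a minimal illustrative sketch, under the assumption of a simple consistent-hashing key-to-host mapping, of the core idea of aggregating the free memory of edge hosts into one logical cache. The class, method, and host names are hypothetical and not taken from the thesis.

    # Illustrative sketch only: assumes a hash-based key -> host mapping over
    # volunteered end-host memory; BufferBank's actual mapping models may differ.
    import bisect
    import hashlib


    def _ring_hash(value: str) -> int:
        """Map a string onto a 32-bit ring position."""
        return int(hashlib.md5(value.encode()).hexdigest(), 16) % (2 ** 32)


    class EdgeMemoryCache:
        """Aggregates the free memory of edge hosts into one logical cache.

        Each participating host is placed on a consistent-hashing ring with
        several virtual nodes; a content key is served by the first host
        clockwise from the key's ring position.
        """

        def __init__(self, hosts, vnodes: int = 64):
            self._ring = []                           # sorted (position, host) pairs
            self._store = {h: {} for h in hosts}      # per-host in-memory buffers
            for host in hosts:
                for i in range(vnodes):
                    self._ring.append((_ring_hash(f"{host}#{i}"), host))
            self._ring.sort()

        def _locate(self, key: str) -> str:
            """Return the host responsible for a content key (O(log n) lookup)."""
            pos = _ring_hash(key)
            idx = bisect.bisect_right(self._ring, (pos, chr(0x10FFFF)))
            if idx == len(self._ring):                # wrap around the ring
                idx = 0
            return self._ring[idx][1]

        def put(self, key: str, chunk: bytes) -> None:
            self._store[self._locate(key)][key] = chunk

        def get(self, key: str):
            return self._store[self._locate(key)].get(key)


    if __name__ == "__main__":
        cache = EdgeMemoryCache(["host-a", "host-b", "host-c"])
        cache.put("movie-42/segment-007", b"...video bytes...")
        print(cache._locate("movie-42/segment-007"))

In such a scheme, finding which host holds a cached chunk costs O(log n), and adding or removing an edge host only remaps the keys adjacent to it on the ring, which matches the scalability goal described above.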
Keywords/Search Tags: The edge of the network, Streaming media cache, Distributed resource aggregation, Memory storage