
Bandwidth-Aware Cache Modeling and Partitioning Strategy Based on Video Scenarios

Posted on: 2014-07-15  Degree: Master  Type: Thesis
Country: China  Candidate: L J Wang  Full Text: PDF
GTID: 2268330422463495  Subject: Computer system architecture
Abstract/Summary:
Multimedia stream computing, such as video encoding and decoding, involves not only high computational complexity but also intensive data access. A multi-level Cache architecture can narrow the speed gap between processor and memory, but on-chip Cache capacity is usually very limited and its energy consumption is relatively high. It is therefore necessary to allocate Cache capacity according to the run-time requirements of multimedia stream computing, making full use of the Cache while ensuring performance and reducing the system's energy consumption.

This paper investigates Cache modeling and a partitioning strategy based on video scenarios for video decoding, a typical multimedia stream computing application. The proposed bandwidth-aware Cache modeling and partitioning strategy is designed under the premise of guaranteeing the soft real-time constraints of video decoding, and combines two features of decoding: first, the volume of data accessed during decoding differs when the video scenario changes; second, energy consumption differs when decoding under different Cache-size configurations. First, after analyzing how performance and energy consumption vary when decoding different video sequences under different Cache-size configurations, a Cache model is established that meets the decoding delay requirements while reducing energy consumption. Then, the best Cache sizes for multiple groups of video sequences are calculated through this model. These Cache sizes are divided into several groups by a simple clustering algorithm, and the characteristics of the corresponding video scenarios are analyzed within each group. Video scenario switches are perceived from the complexity of video motion and the frame residual, and the mapping between these two factors and the Cache demand during decoding is then established.
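The grouping step above can be sketched as a simple one-dimensional clustering pass over the per-sequence best Cache sizes. All function names, sizes, and the gap threshold below are illustrative assumptions, not values from the thesis.

```python
# Illustrative sketch: group per-sequence "best Cache sizes" (in KB) with a
# simple 1-D clustering pass, then derive one representative size per group.
# All numbers and names are hypothetical, not taken from the thesis.

def cluster_cache_sizes(best_sizes_kb, gap_kb=16):
    """Sort the sizes and start a new group whenever the gap to the
    previous size exceeds gap_kb (a simple 1-D clustering)."""
    groups = []
    for size in sorted(best_sizes_kb):
        if groups and size - groups[-1][-1] <= gap_kb:
            groups[-1].append(size)
        else:
            groups.append([size])
    return groups

def representative_sizes(groups):
    """Use the maximum of each group so that every sequence in the group
    still meets its decoding delay requirement."""
    return [max(g) for g in groups]

sizes = [32, 40, 48, 96, 104, 192]      # best sizes from the Cache model
groups = cluster_cache_sizes(sizes)
print(groups)                           # [[32, 40, 48], [96, 104], [192]]
print(representative_sizes(groups))     # [48, 104, 192]
```

Taking the maximum per group is one conservative policy; the thesis's actual grouping criterion may differ.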
Thereby, the mapping relationship can be used to allocate Cache size dynamically during decoding. This Cache partitioning strategy improves Cache utilization; moreover, while guaranteeing the soft real-time constraints of video decoding, a suitable Cache size yields lower system energy consumption.

Experiments were conducted on TI's TMS320C6416 platform with an H.264 video decoder as the software platform, testing various standard video sequences in CIF format and analyzing both the Level-2 Cache hit rate and energy consumption. The results show that, under the premise of guaranteeing the soft real-time constraints of video decoding, the on-chip Cache energy consumption of the C6416 is reduced by 30.37% while the L2 Cache hit rate decreases by only 3.38% on average after optimization. It can be safely concluded that the Cache partitioning strategy based on video scenarios guarantees decoding performance while reducing the system's energy consumption.
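The run-time side of the strategy, detecting a scenario switch from motion complexity and frame residual and then looking up the Cache demand, can be sketched as follows. The threshold table, normalization, and sizes are hypothetical placeholders, not the thesis's actual mapping.

```python
# Illustrative sketch of run-time dynamic Cache allocation: map normalized
# motion complexity and frame residual to a Cache size, repartitioning only
# when the demand changes. Thresholds and table entries are hypothetical.

SCENARIO_TABLE = [
    # (max_motion, max_residual) -> Cache size in KB
    ((0.3, 0.3), 48),    # low-motion, low-residual scenario
    ((0.7, 0.7), 104),   # medium scenario
    ((1.0, 1.0), 192),   # high-motion / high-residual scenario
]

def cache_demand(motion, residual):
    """Look up the Cache size for the first scenario whose thresholds
    cover both factors."""
    for (max_m, max_r), size_kb in SCENARIO_TABLE:
        if motion <= max_m and residual <= max_r:
            return size_kb
    return SCENARIO_TABLE[-1][1]

def decode_loop(frames):
    """Per-frame loop: repartition the Cache only on a scenario switch."""
    current_kb = None
    for f in frames:
        need = cache_demand(f["motion"], f["residual"])
        if need != current_kb:      # scenario switch detected
            current_kb = need       # repartition the Cache here
        # ... decode the frame using current_kb of Cache ...
    return current_kb
```

Repartitioning only on a switch, rather than every frame, keeps the reconfiguration overhead out of the steady-state decoding path.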
Keywords/Search Tags: Stream computing, video decoding, video scenarios, cache modeling and partitioning, energy-efficient