
Research On Arithmetic Coder Based On Parallel Context Model

Posted on: 2012-08-24  Degree: Master  Type: Thesis
Country: China  Candidate: M Gao  Full Text: PDF
GTID: 2218330362450412  Subject: Computer applications
Abstract/Summary:
The entropy coder is one of the key technologies in video coding. To improve its performance, context-based entropy coders have been proposed. In general, entropy coders fall into two classes: variable-length coders and arithmetic coders. The advantage of the variable-length coder is its low computational cost, but its code words have integer lengths and it cannot track the time-varying statistics of the source, so its compression performance is limited. An arithmetic coder maps several symbols, or even the whole symbol sequence, to a single code word, so on average it can assign a symbol a code word shorter than one bit; moreover, it can adapt the symbol probabilities to the local statistics of the data. Its average code length therefore approaches the entropy limit more closely, yielding better compression performance. The main drawback of the arithmetic coder is its high computational cost, because its principle is based on iterative subdivision of the coding interval. To reduce this cost, several approximate approaches have been proposed; a representative one is the context-adaptive binary arithmetic coder (CABAC) adopted in H.264/AVC, which is among the best arithmetic coders available today.

With the emergence of HD and UHD video, compression performance is no longer the only criterion: data throughput must also be taken into account when evaluating an entropy coder. The serial nature of CABAC forces the CABAC engine to run at extremely high clock frequencies to decode high-bit-rate video streams, which consumes more power and is sometimes infeasible. Many parallelization techniques have been proposed to improve the throughput of CABAC; they accelerate different modules of CABAC and can be combined to raise the throughput of the whole system. Our analysis of CABAC shows that context modeling is the bottleneck of the whole system, so we propose a parallel context model to further improve the throughput of CABAC. The proposed context model uses the number of non-zero coefficients in a DCT block and the scan position of each non-zero coefficient as the context, which breaks the context dependency between different coefficients. Context modeling can therefore run in parallel with the arithmetic coding engine, effectively hiding the context-modeling stage, and because the contexts of different coefficients can be computed in parallel, the model can be combined with existing parallel techniques to greatly increase the throughput of CABAC. Experimental results show that the proposed context model achieves compression performance similar to that of CABAC, with the bit rate reduced by 0.029% on the test sequences.
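To illustrate the idea behind a context model of this kind, the following C sketch derives a context index for a coefficient's level bins from only two block-level quantities: the number of non-zero coefficients and the coefficient's scan position. This is a minimal illustration under assumed parameters, not the thesis's actual mapping; the class counts, thresholds, and function names are hypothetical. The point it demonstrates is that no context index depends on previously decoded levels, so the indices for all coefficients in a block can be produced in one independent pass, in parallel with the arithmetic coding engine.

```c
/*
 * Illustrative sketch of a coefficient-independent context model.
 * The quantisation thresholds and table sizes below are assumptions
 * for demonstration only, not values from the thesis.
 */
#include <stddef.h>

#define NUM_POS_CLASSES 4   /* assumed number of scan-position classes */

/* Map the block's non-zero coefficient count to a coarse class. */
static int nz_count_class(int num_nonzero)
{
    if (num_nonzero <= 1) return 0;
    if (num_nonzero <= 3) return 1;
    if (num_nonzero <= 7) return 2;
    return 3;
}

/* Map a coefficient's scan position to a coarse class (4x4 block assumed). */
static int scan_pos_class(int scan_pos)
{
    if (scan_pos <= 2)  return 0;
    if (scan_pos <= 5)  return 1;
    if (scan_pos <= 10) return 2;
    return 3;
}

/*
 * Context index for one coefficient's level bins.  Both arguments are
 * fixed before level coding starts, so indices of different coefficients
 * do not depend on each other.
 */
static int level_ctx_index(int num_nonzero, int scan_pos)
{
    return nz_count_class(num_nonzero) * NUM_POS_CLASSES
         + scan_pos_class(scan_pos);
}

/* Compute context indices for every non-zero coefficient of a block. */
void derive_block_level_contexts(const int *scan_positions,
                                 size_t num_nonzero,
                                 int *ctx_out)
{
    for (size_t i = 0; i < num_nonzero; ++i)   /* fully parallelisable loop */
        ctx_out[i] = level_ctx_index((int)num_nonzero, scan_positions[i]);
}
```

Because the loop body has no dependence on earlier iterations, a hardware decoder could evaluate all iterations concurrently, which is the property that lets context modeling overlap with the binary arithmetic engine.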
Keywords/Search Tags: CABAC, parallel context model, level information