
Multi-Condition Context Model Quantization Based On Dynamic Programming

Posted on: 2018-02-16  Degree: Master  Type: Thesis
Country: China  Candidate: T Meng  Full Text: PDF
GTID: 2428330518458670  Subject: Electronics and Communications Engineering
Abstract/Summary:
According to entropy coding theory, conditional entropy is no greater than entropy, so data can be compressed more efficiently using a context model. In practice, however, as the number of context models increases, compression performance first improves and then degrades, because the sample data available to each model is limited. This phenomenon is called "model dilution". To alleviate the model dilution problem, this thesis proposes a multi-condition context model quantization method based on dynamic programming. First, second-order and third-order context models are built to estimate the initial conditional probability distributions, with description length used as the evaluation criterion. Dynamic programming is then applied to find the partition of the conditional probability distributions that minimizes the description length, yielding the quantized second-order and third-order context models. The cumulative distribution characteristics are then used to build cumulative probability statistics, which serve to merge conditional probabilities. Experimental analysis verifies the performance of the proposed method. The results show that multi-condition context model quantization based on dynamic programming significantly improves the performance of the entropy coding system and the overall data compression performance, and that using the cumulative distribution effectively reduces the complexity of the algorithm.
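The core quantization step described above, partitioning a set of context-conditioned histograms into a small number of cells so that the total empirical description length is minimized, can be sketched with a standard dynamic program over contexts ordered along a scalar statistic (e.g. the conditional probability of one symbol). This is an illustrative sketch, not the thesis's exact algorithm; the function names, the model-cost omission, and the two-symbol example data are hypothetical.

```python
import math

def desc_length(counts):
    # Code length (in bits) of all symbols pooled into one quantized cell:
    # total count times the entropy of the merged empirical distribution.
    total = sum(counts)
    if total == 0:
        return 0.0
    return sum(-c * math.log2(c / total) for c in counts if c > 0)

def quantize_contexts(hists, K):
    """Partition N ordered context histograms into K contiguous cells,
    minimizing total description length via dynamic programming.
    Returns (minimum description length, list of (start, end) cells)."""
    N, S = len(hists), len(hists[0])
    # prefix[i][s] = count of symbol s summed over hists[0..i-1]
    prefix = [[0] * S]
    for h in hists:
        prefix.append([prefix[-1][s] + h[s] for s in range(S)])

    def cost(i, j):  # description length of merging hists[i..j-1]
        return desc_length([prefix[j][s] - prefix[i][s] for s in range(S)])

    INF = float('inf')
    # dp[j][k] = best cost of covering the first j histograms with k cells
    dp = [[INF] * (K + 1) for _ in range(N + 1)]
    back = [[0] * (K + 1) for _ in range(N + 1)]
    dp[0][0] = 0.0
    for j in range(1, N + 1):
        for k in range(1, min(j, K) + 1):
            for i in range(k - 1, j):  # last cell is hists[i..j-1]
                c = dp[i][k - 1] + cost(i, j)
                if c < dp[j][k]:
                    dp[j][k] = c
                    back[j][k] = i
    # Recover the cell boundaries by walking the backpointers.
    cells, j, k = [], N, K
    while k > 0:
        i = back[j][k]
        cells.append((i, j))
        j, k = i, k - 1
    return dp[N][K], cells[::-1]
```

For instance, four binary-symbol histograms sorted by P(0), such as `[[9, 1], [8, 2], [2, 8], [1, 9]]`, quantized into K = 2 cells, are split between the second and third context, since pooling dissimilar distributions would inflate the description length.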
Keywords/Search Tags:Entropy coding, Dynamic programming, Description length, Context model, Context quantization