
Lossless Image Compression Technology Based On Parallel Processing

Posted on: 2009-08-01    Degree: Master    Type: Thesis
Country: China    Candidate: J L Zhang    Full Text: PDF
GTID: 2178360242494598    Subject: Computer software and theory
Abstract/Summary:
With the development of computer science and technology, its application areas have become increasingly widespread. In image processing, the sheer volume of data makes improving processing speed one of the most important challenges; this stems from the characteristics of image data and the complexity of image processing algorithms. Processing and transmitting multimedia information with minimal time and space overhead is difficult, and it is precisely the key problem in image processing. Increasing communication bandwidth can accelerate data transmission, and larger memory can provide greater storage capacity, but neither truly meets the needs of practical applications. What is required is an image compression algorithm with a high compression ratio. Most image compression algorithms, however, involve complicated operations, heavy computation, large amounts of data, and regular, standardized computation patterns. Parallel computing is an effective means of improving processing speed, and with the development of high-performance parallel processing systems, parallel image processing technology offers greater room for accelerating image processing. Therefore, while pursuing a high compression ratio for the image signal, we must also find a suitable parallel image compression algorithm to improve computing speed and meet the real-time requirements of a variety of systems.

The development of hardware technology in recent decades has brought rapid improvements in CPU processing speed, yet the speed requirements of many advanced applications still cannot be satisfied. The contradiction between the limits of what a single computer can achieve and the unbounded demands of scientific computing determines that multi-computer parallelism is the direction of computer development. Fast and effective parallel computers are needed for large-scale scientific computing and data processing in many fields, such as energy, meteorology, the military, medicine, artificial intelligence, and basic research. The demand for parallel processing has therefore greatly promoted the development of parallel technology.

Taking image compression and parallel computing as the main subjects of study, we focus on a parallel algorithm for Huffman encoding. First, we outline the basic knowledge of image processing, describe parallel image processing technology in detail, and review the relevant theory of image compression methods. Second, we introduce parallel computing architectures and models as well as the factors that determine their performance, and focus on the programming environment for parallel algorithms. We demonstrate the effectiveness of parallel algorithms by computing the value of pi on the MPI platform (see the sketch below).

The main research work: through analysis of the serial Huffman coding algorithm, we prove the feasibility of parallel Huffman coding. A parallel Huffman coding algorithm is then derived based on a reallocation-coding method; the algorithm can be applied to both encoding and decoding. Finally, an analysis of its time complexity shows that the algorithm achieves high parallel efficiency.
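The abstract mentions validating the parallel programming environment by computing the value of pi on the MPI platform. Below is a minimal sketch of that classic exercise in C with MPI, using the midpoint rule for the integral of 4/(1+x^2); the interval count n and the strided work split are illustrative choices, not details taken from the thesis.

```c
/* Sketch: estimating pi with the midpoint rule, work split across MPI ranks. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int rank, size;
    long n = 100000000;                 /* number of intervals (illustrative) */
    double h, local_sum = 0.0, pi;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    h = 1.0 / (double)n;
    /* Each rank integrates a strided subset of the midpoint terms. */
    for (long i = rank; i < n; i += size) {
        double x = h * ((double)i + 0.5);
        local_sum += 4.0 / (1.0 + x * x);
    }
    local_sum *= h;

    /* Combine the partial sums on rank 0. */
    MPI_Reduce(&local_sum, &pi, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("pi ~= %.12f\n", pi);

    MPI_Finalize();
    return 0;
}
```

Run with, for example, `mpirun -np 4 ./pi`; the speedup over a single rank illustrates the effectiveness of the parallel approach discussed above.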
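The thesis's parallel Huffman coder is based on a reallocation-coding method whose details are not reproduced in this abstract. As a generic illustration only, not the thesis's algorithm, the following sketch shows the data-decomposition step that commonly precedes parallel Huffman compression of an image: each MPI rank builds a symbol histogram over its own image slice, the histograms are summed with MPI_Allreduce so that every rank holds identical frequencies, and each rank can then construct the same code table and encode its slice independently. The slice contents here are synthetic placeholders.

```c
/* Hypothetical sketch (not the thesis's reallocation-coding scheme):
 * block-parallel frequency counting as a prelude to Huffman coding. */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

#define SYMBOLS 256   /* 8-bit grayscale pixels assumed */

int main(int argc, char **argv) {
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Placeholder image slice: in practice each rank would read its
     * own stripe of the image file. */
    const long slice_len = 1 << 20;
    unsigned char *slice = malloc(slice_len);
    for (long i = 0; i < slice_len; i++)
        slice[i] = (unsigned char)((i * (rank + 1)) & 0xFF);

    long local_freq[SYMBOLS] = {0}, global_freq[SYMBOLS];
    for (long i = 0; i < slice_len; i++)
        local_freq[slice[i]]++;

    /* Global histogram: every rank receives the summed frequencies,
     * so all ranks can build an identical Huffman code table and
     * then encode their own slices independently. */
    MPI_Allreduce(local_freq, global_freq, SYMBOLS, MPI_LONG,
                  MPI_SUM, MPI_COMM_WORLD);

    if (rank == 0)
        printf("total pixels counted: %ld\n", (long)size * slice_len);

    free(slice);
    MPI_Finalize();
    return 0;
}
```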
Keywords/Search Tags: parallel algorithm, image processing, lossless compression, Huffman coding