
Research On Image Fusion Using Multiscale Empirical Mode Decomposition

Posted on: 2010-03-02  Degree: Doctor  Type: Dissertation
Country: China  Candidate: Y Z Zheng  Full Text: PDF
GTID: 1118360308457496  Subject: Computer Science and Technology
Abstract/Summary:
With the increasing application of multiple image sensors across a wide range of areas, in both the military and civil domains, image fusion, the task of combining multiple source images into a single image, has become an increasingly important problem. The variety of sensor modalities, the large volume of image data, and the complexity of image features pose significant challenges for image fusion techniques. This dissertation presents an intensive study of multiscale decomposition, synthesis algorithms for multiscale representations, and quality evaluation metrics for fused images. The work is summarized as follows:

Firstly, adaptive coordinate empirical mode decomposition (AC-EMD) is proposed to address the poor adaptivity of multiscale decompositions (MSDs) in image fusion algorithms. AC-EMD is a fully data-driven multiscale decomposition that self-adaptively and coordinately decomposes the source images into a number of "well-behaved" intrinsic mode functions (IMFs) plus a residual image. The AC-EMD representation captures the physical features of images better than pyramid and wavelet decompositions.

Secondly, two types of pyramidal empirical mode decomposition are proposed. The first, pyramid empirical mode decomposition (PEMD), is less redundant and combines the merits of the Laplacian pyramid with the properties of AC-EMD. The second is a hybrid of empirical mode decomposition (EMD) and the contourlet transform (CT), named EMD-CT. The EMD-CT transform shares the high adaptivity of AC-EMD and the data structure of pyramidal transforms, while gaining the multidirectional analysis capability of the contourlet transform. EMD-CT not only reduces the redundancy of AC-EMD but also achieves a directional representation of the source images.

Thirdly, a fusion rule combining PCA with consistency checking is proposed to overcome both the uneven information content of multisensor source images and the discontinuity of the fused image when synthesizing multiscale representations.
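The AC-EMD described above operates on 2-D images with coordinated extrema detection, which the dissertation develops in full. As a rough illustration of the underlying sifting idea only, the following is a minimal 1-D EMD sketch (the helper names `sift` and `emd` are hypothetical, and linear envelopes via `np.interp` stand in for the spline envelopes used in practice):

```python
import numpy as np

def sift(signal, max_iter=10):
    """One-dimensional sketch of EMD sifting: repeatedly subtract the mean
    of the upper and lower extrema envelopes until the result resembles an
    intrinsic mode function (IMF)."""
    h = signal.astype(float)
    for _ in range(max_iter):
        # Locate local maxima and minima of the current candidate.
        maxima = [i for i in range(1, len(h) - 1) if h[i-1] < h[i] > h[i+1]]
        minima = [i for i in range(1, len(h) - 1) if h[i-1] > h[i] < h[i+1]]
        if len(maxima) < 2 or len(minima) < 2:
            break  # too few extrema to build envelopes
        x = np.arange(len(h))
        # Linear envelopes keep this sketch dependency-light; real EMD
        # implementations typically use cubic-spline envelopes.
        upper = np.interp(x, maxima, h[maxima])
        lower = np.interp(x, minima, h[minima])
        h = h - (upper + lower) / 2.0  # remove the local mean
    return h

def emd(signal, n_imfs=3):
    """Decompose a signal into IMFs plus a residual, fine scale to coarse."""
    residual = signal.astype(float)
    imfs = []
    for _ in range(n_imfs):
        imf = sift(residual)
        imfs.append(imf)
        residual = residual - imf
    return imfs, residual

# A toy two-tone signal: sifting extracts the fast oscillation first.
t = np.linspace(0, 1, 512)
s = np.sin(2 * np.pi * 5 * t) + np.sin(2 * np.pi * 40 * t)
imfs, res = emd(s, n_imfs=2)
# By construction, the IMFs and residual sum back to the input.
print(np.allclose(sum(imfs) + res, s))
```

Note that the decomposition is perfectly invertible by construction, which is the property that lets fusion rules operate on the IMFs and then reconstruct a fused image.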
To resolve the inconsistency and discontinuity problems of the fused image more completely, we also propose a region-based fusion rule for synthesizing multiscale representations that exploits the regional properties of the target image. The former fusion rule has lower computational complexity, while the latter achieves better fused-image quality.

Finally, two objective image quality metrics that require no reference image are proposed: one based on the Renyi entropy and one based on structural similarity. The former measures the total amount of information that the fused image contains about the source images, using the Renyi entropy to avoid the overlap problem of mutual information. The latter considers not only the similarity between the source images and the fused image, but also the similarity among the source images themselves.
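To give a concrete sense of a no-reference, structural-similarity-based fusion metric of the kind described above, here is a hypothetical simplified sketch. The function names, the single-window (global) SSIM variant, and the redundancy-discount weighting are all placeholder assumptions for illustration, not the dissertation's actual formula:

```python
import numpy as np

def ssim_global(x, y, c1=1e-4, c2=9e-4):
    """Global (single-window) structural similarity between two images.
    Standard SSIM averages over local windows; a single global window
    keeps this sketch short."""
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2*mx*my + c1) * (2*cov + c2)) / ((mx**2 + my**2 + c1) * (vx + vy + c2))

def fusion_quality(src_a, src_b, fused):
    """No-reference fusion metric sketch: average how well the fused image
    preserves the structure of each source, then apply an illustrative
    discount when the sources already share structure (so that preserving
    one source should not earn double credit)."""
    s_af = ssim_global(src_a, fused)
    s_bf = ssim_global(src_b, fused)
    s_ab = ssim_global(src_a, src_b)  # similarity among the sources
    return (s_af + s_bf) / 2.0 - 0.5 * max(s_ab, 0.0) * abs(s_af - s_bf)

rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = rng.random((64, 64))
f = (a + b) / 2.0  # naive average fusion as a baseline
print(round(fusion_quality(a, b, f), 3))
```

The key design point, shared with the metric summarized above, is that the inter-source term `s_ab` enters the score: two highly similar sources carry redundant information, so a metric that ignored their mutual similarity would overstate how much the fused image preserves.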
Keywords/Search Tags:Image fusion, Multiscale decomposition, Empirical mode decomposition (EMD), Fusion rules, Quality evaluation