
Image spatial entropy based on quadrilateral Markov random field

Posted on: 2010-08-17 | Degree: Ph.D. | Type: Dissertation
University: The University of Texas at Dallas | Candidate: Rahimshoar Razlighi, Qolamreza | Full Text: PDF
GTID: 1448390002488837 | Subject: Engineering
Abstract/Summary:
The use of information-theoretic measures such as entropy and mutual information has been steadily growing in image processing applications. Shannon entropy is a powerful tool in image analysis, but its reliable computation from image data faces an inherent dimensionality problem that calls for a low-dimensional, closed-form model of the pixel value distributions. The most promising such models are Markovian; however, the conventional Markov random field (MRF) is hampered by non-causality, and its causal versions are not free of difficulties either. The Markov mesh random field, for example, suffers from a strong diagonal dependency in its local neighborhood system. This is because all existing causal MRFs are defined to be unilateral with respect to only one corner of the image, whereas the image lattice by nature has four corners and is therefore quadrilateral.

In this dissertation, a quadrilateral MRF is developed to resolve this directional bias. It is shown that the model allows the probability density function (pdf) of the image lattice to be written as the product of a large number of two-dimensional pdfs. Under a homogeneity assumption, these two-dimensional neighboring-process pdfs can be estimated from their joint histograms computed over a given image. Furthermore, the dimensionless character of Shannon entropy and its logarithmic nature convert the product of two-dimensional pdfs into a summation of simple dimensionless terms, yielding the image spatial entropy (ISE). The computational complexity of this method is consequently higher than that of the classical entropy. An estimation method is therefore also developed to reduce the complexity of ISE, and it is shown that, for applications that can tolerate slight inaccuracies, ISE can be computed with the same complexity as the classical entropy.

The same computational approach is used to obtain the image spatial mutual information. Based on this mutual information, two new similarity measures for image registration are introduced and compared with existing similarity measures in the context of medical image registration. The results show the superiority of the introduced similarity measures over the existing ones.
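The dissertation's full quadrilateral MRF factorization is not reproduced here; the following is a minimal Python sketch of the joint-histogram step it relies on under the homogeneity assumption: the two-dimensional pdf of a neighboring pixel pair is estimated by a normalized joint histogram, and the resulting pairwise joint entropies are summed. The function names (pairwise_joint_entropy, spatial_entropy_estimate), the 256-level quantization, and the restriction to horizontal and vertical neighbors are illustrative assumptions, not the neighborhood structure or exact formulation of the dissertation.

    import numpy as np

    def pairwise_joint_entropy(img, shift):
        # Shannon entropy of the joint histogram of pixel pairs separated by
        # shift = (dy, dx). Under the homogeneity assumption, the joint
        # histogram taken over the whole image serves as an estimate of the
        # two-dimensional pdf of the neighboring process.
        dy, dx = shift
        h, w = img.shape
        a = img[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)]
        b = img[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
        hist, _, _ = np.histogram2d(a.ravel(), b.ravel(),
                                    bins=256, range=[[0, 256], [0, 256]])
        p = hist / hist.sum()
        p = p[p > 0]                  # drop empty bins before taking the log
        return float(-np.sum(p * np.log2(p)))

    def spatial_entropy_estimate(img):
        # Crude spatial-entropy figure: sum of the pairwise joint entropies of
        # horizontal and vertical neighbors. An illustration of the
        # joint-histogram idea only, not the quadrilateral MRF factorization.
        return (pairwise_joint_entropy(img, (0, 1)) +
                pairwise_joint_entropy(img, (1, 0)))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        image = rng.integers(0, 256, size=(128, 128))   # synthetic 8-bit image
        print(spatial_entropy_estimate(image))

The same joint-histogram machinery would underlie a spatial mutual information estimate between two images, which is the quantity the registration similarity measures build on.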
Keywords/Search Tags:Entropy, Markov random field, Measures, Mutual information, Quadrilateral