
Research on Pixel-Level Fusion Methods for Multifocus Images

Posted on: 2013-06-01    Degree: Doctor    Type: Dissertation
Country: China    Candidate: H F Li    Full Text: PDF
GTID: 1228330392453956    Subject: Control theory and control engineering
Abstract/Summary:
Due to the limited depth of focus of optical lenses, it is difficult to capture an image in which all relevant objects are in focus. This problem can be solved by multifocus image fusion, which allows objects at different distances to be presented clearly in a single image. Fusion effectively improves the information utilization of the image and enhances the reliability of target detection and recognition. Moreover, it lays a good foundation for subsequent processing such as image recognition, edge detection, image segmentation, and feature extraction. As an effective tool, multifocus image fusion has been widely applied in many fields, including medical imaging, microscopic imaging, military operations, and machine vision.

The key task of multifocus image fusion is to find the clear regions or pixels in the source images; the fused image containing all relevant objects in focus is then obtained by composing these clear regions or pixels. In other words, the key point of multifocus image fusion is to determine which pixels or regions are in focus, and this is one of its main difficulties: the complexity of the image content makes clarity evaluation hard, so the fusion result is often unsatisfactory. To address the deficiencies of existing methods, this dissertation studies multifocus image fusion and introduces two kinds of fusion methods, in the transform domain and in the spatial domain, respectively.

Traditional image fusion methods based on the multiscale transform (MST) are susceptible to noise. To address this deficiency, a new multifocus image fusion method is presented that integrates a pulse coupled neural network (PCNN) with the multiscale products of the lifting stationary wavelet transform (LSWT).
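The multiscale-product idea can be illustrated with a small sketch. The code below is not the dissertation's LSWT algorithm: it uses a plain à-trous (undecimated) wavelet decomposition as a stand-in, since both keep adjacent scales sample-aligned, which is what makes the pointwise product of adjacent detail planes meaningful. All function names here are illustrative.

```python
import numpy as np

def atrous_details_1d(signal, levels=2):
    """Undecimated (a-trous) wavelet detail coefficients of a 1-D signal.
    Stand-in for the LSWT: the detail at each level is the difference
    between successive smoothings, so all levels stay sample-aligned."""
    kernel = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0  # B3-spline taps
    smooth = signal.astype(float)
    n = len(smooth)
    details = []
    for level in range(levels):
        step = 2 ** level                 # holes between taps double per level
        idx = np.arange(n)
        blurred = np.zeros(n)
        for k, w in enumerate(kernel):
            src = np.clip(idx + (k - 2) * step, 0, n - 1)  # replicate borders
            blurred += w * smooth[src]
        details.append(smooth - blurred)  # detail plane at this scale
        smooth = blurred
    return details

def multiscale_product(details):
    """Pointwise product of adjacent-scale details: a true edge responds at
    every scale, so the product stays large there, while noise is largely
    scale-local and gets diluted."""
    prod = details[0]
    for d in details[1:]:
        prod = prod * d
    return prod

rng = np.random.default_rng(0)
step_edge = np.concatenate([np.zeros(64), np.ones(64)])  # one sharp edge
noisy = step_edge + 0.05 * rng.standard_normal(128)
d = atrous_details_1d(noisy, levels=2)
p = multiscale_product(d)
edge_idx = int(np.argmax(np.abs(p)))  # largest response sits at the step edge
```

The product response at the edge dwarfs the noise residue, which is the property the proposed method exploits when selecting coefficients.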
In this method, a feature of the multiscale products is used to motivate the PCNN neurons, and the LSWT coefficients with large firing times are selected to compose the fused image coefficients. Because the products of adjacent scales amplify important structures and dilute noise, the proposed method reduces the influence of noise and obtains a pleasing fusion result.

To overcome the inability of the traditional contrast measure to judge whether pixels are in focus, the concept of the feature contrast of orientation information is presented in this dissertation, and a coefficient selection scheme based on it is developed to merge the high-frequency coefficients in the LSWT domain. The proposed measure incorporates the fact that the human visual system is highly sensitive to local contrast. In addition, because the orientation information measure effectively distinguishes edge regions from smooth regions and is insensitive to noise, the proposed feature contrast reduces the influence of noise and gives the algorithm higher fusion performance.

Multifocus image fusion methods based on MST are prone to selecting the coefficients of the fused image mistakenly. To address this problem, a novel sharpness measure based on the fractional differential is introduced, and a focused-region detection method is then developed from this measure and used to design the fusion rules for the different subband coefficients. When subband pixels are located in the focused regions, they are selected directly to compose the fused image.
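A fractional-differential sharpness measure can be sketched as follows. This is a minimal illustration, not the dissertation's exact measure: it uses the truncated Grünwald–Letnikov difference, whose coefficients follow the recurrence c_0 = 1, c_k = c_{k-1}·(k − 1 − v)/k, and scores a patch by the mean absolute fractional-difference response. The order v = 0.5 and all names are assumptions for the sketch.

```python
import numpy as np

def gl_coefficients(v, n_terms=5):
    """First few Grunwald-Letnikov fractional-difference coefficients for
    order v: c_0 = 1, c_k = c_{k-1} * (k - 1 - v) / k."""
    c = [1.0]
    for k in range(1, n_terms):
        c.append(c[-1] * (k - 1 - v) / k)
    return np.array(c)

def fractional_sharpness(img, v=0.5, n_terms=5):
    """Sharpness as the mean absolute truncated fractional difference taken
    along rows and columns; larger values indicate a more sharply focused
    patch. An illustrative stand-in for the dissertation's measure."""
    c = gl_coefficients(v, n_terms)
    img = img.astype(float)
    resp = np.zeros_like(img)
    for k, ck in enumerate(c):
        resp += ck * (np.roll(img, k, axis=1)   # shifted copies along x
                      + np.roll(img, k, axis=0))  # and along y
    return float(np.mean(np.abs(resp)))

# A 1-pixel checkerboard is maximally sharp; a 3x3 box blur defocuses it.
x = (np.indices((16, 16)).sum(axis=0) % 2).astype(float)
blur = sum(np.roll(np.roll(x, i, 0), j, 1)
           for i in (-1, 0, 1) for j in (-1, 0, 1)) / 9.0
sharp_score = fractional_sharpness(x)
blur_score = fractional_sharpness(blur)
```

The focused (sharp) patch scores clearly higher than its blurred counterpart, which is the comparison the focused-region detection relies on.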
If the pixels are located in the transition region between focus and defocus, they are fused according to the new sharpness measure. The proposed method therefore effectively restrains mistaken coefficient selection and improves fusion performance.

In the spatial domain, exploiting the fact that MST-based fusion avoids discontinuities in the fused image, a novel fusion method is proposed that combines focused-region detection with MST. In this method, an MST-based fusion scheme is first used to acquire an initial fused image. Pixels of the original images that are similar to the corresponding pixels of the initial fused image are considered to lie in the sharply focused regions; in this way, the initial focused regions are determined. Finally, post-processing is employed to correct misclassifications and produce the decision map. According to the pixel locations in the decision map and the initial fused image, a fusion scheme is designed to obtain the final fused image. The proposed method successfully avoids discontinuity in the transition zone between focus and defocus and improves fusion performance.

To address the deficiency of the traditional multiscale Top-Hat transform, which cannot extract image features at all scales, a new modified multiscale Top-Hat transform is introduced, and a novel fusion method based on this transform and a dual-window technique is presented. The dual-window technique, used to determine the transition zone, is designed to overcome the difficulty of choosing the size of a single window. With this technique, the classification error for transition-zone pixels is reduced, establishing a good foundation for the subsequent image fusion.

Finally, a summary of the research contents is presented.
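The building block behind the Top-Hat-based method can be sketched in pure NumPy. This shows only the classical multiscale white top-hat (image minus its opening, accumulated over structuring elements of growing size); the dissertation's modified transform and the dual-window step are not reproduced, and all names are illustrative.

```python
import numpy as np

def erode(img, size):
    """Grayscale erosion with a flat (2*size+1)^2 square structuring
    element, built from shifted copies (borders wrap around)."""
    out = img.copy()
    for di in range(-size, size + 1):
        for dj in range(-size, size + 1):
            out = np.minimum(out, np.roll(np.roll(img, di, 0), dj, 1))
    return out

def dilate(img, size):
    """Grayscale dilation, the dual of erosion."""
    out = img.copy()
    for di in range(-size, size + 1):
        for dj in range(-size, size + 1):
            out = np.maximum(out, np.roll(np.roll(img, di, 0), dj, 1))
    return out

def white_top_hat(img, size):
    """Image minus its opening: keeps bright features smaller than the
    structuring element (the opening is always <= the image)."""
    opening = dilate(erode(img, size), size)
    return img - opening

def multiscale_top_hat(img, sizes=(1, 2, 3)):
    """Accumulate bright-feature planes over several element sizes, so
    features at every scale up to the largest element contribute."""
    return sum(white_top_hat(img.astype(float), s) for s in sizes)

# A single bright pixel on a dark background is a small-scale feature:
# every scale's top-hat recovers it, so the accumulated map peaks there.
img = np.zeros((16, 16))
img[8, 8] = 1.0
feat = multiscale_top_hat(img)
```

The accumulated feature map is non-negative by construction and concentrates at the bright feature, which is what makes it usable as a salience measure for fusion.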
Moreover, directions and targets for further research are pointed out.
Keywords/Search Tags: Image Fusion, Lifting Stationary Wavelet Transform, Fusion Rules, Focused Regions Detection, Multiscale Top-Hat Transform