With the rapid development of remote sensing applications, fusing SAR, optical, and other multi-source remote sensing images of complex scenes has become a clear trend. Image fusion extracts rich spatial, spectral, and temporal information from SAR and optical images and synthesizes new high-quality images. However, SAR and optical sensors have different imaging mechanisms, there are significant nonlinear radiometric differences between the images, and SAR images contain strong noise. These factors cause spectral and spatial distortion in the fusion results of SAR and optical images, making it difficult to meet the practical needs of applications. Research on high-quality fusion algorithms for SAR and optical remote sensing images therefore remains an active and challenging topic. The pixel activity measures and injection-weight calculation methods in existing fusion algorithms are not robust to spectral differences and noise, so they cannot extract complete spectral and spatial information and struggle to achieve high-quality fusion of SAR and optical images. Given this, this thesis investigates fusion methods for SAR and optical remote sensing images based on component substitution and multi-scale features, and improves the pixel activity measures and injection-weight calculation methods. The main results are as follows:

(1) A fusion algorithm based on non-local self-similarity and divergence information. To address the problem that traditional activity-level measures ignore the influence of structurally similar pixels on the central pixel, which causes severe spectral and spatial distortion, the thesis proposes a fusion algorithm based on non-local self-similarity and divergence information. The algorithm builds its basic fusion framework on the hyperspherical color space and the non-subsampled contourlet transform. When fusing the low-frequency coefficients, the algorithm fully accounts for the interaction between structurally similar pixels, uses non-local self-similarity to improve the directional entropy of the image, and constructs a "non-local directional entropy maximum" fusion rule. When fusing the high-frequency coefficients, the algorithm constructs a "maximum divergence" fusion rule based on divergence information. Experimental analysis shows that the proposed algorithm effectively reduces global brightness and contrast variation as well as spectral and spatial distortion at texture edges.

(2) A fusion algorithm based on gain injection and guided filtering. To address the problems that weighted-average fusion may cause spectral distortion and that the weak noise immunity of the divergence feature may lead to incomplete spatial information, the thesis proposes a fusion algorithm based on gain injection and guided filtering. The algorithm builds its basic framework on the generalized intensity-hue-saturation transform and the non-subsampled contourlet transform. In the low-frequency fusion, the algorithm constructs a "gain injection of divergence features" fusion rule that injects only the features specific to the SAR image, further reducing the spectral distortion caused by spectral differences. In the high-frequency fusion, the algorithm uses a guided filter to optimize the texture details of the weight maps in the "maximum divergence" fusion rule, reducing spatial distortion. Experimental analysis shows that the proposed fusion
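To make the high-frequency step of algorithm (2) concrete, the following is a minimal sketch of how a hard decision map derived from two activity measures can be refined with a guided filter before the sub-bands are combined. It is not the thesis's exact formulation: the function names, the choice of guide images, and the radius and eps values are illustrative assumptions, and the thesis's divergence feature is stood in for by whatever activity maps the caller supplies.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, src, radius=4, eps=1e-3):
    """Classic guided filter: smooth `src` while following the edges of `guide`."""
    win = 2 * radius + 1
    mean_g  = uniform_filter(guide, win)
    mean_s  = uniform_filter(src, win)
    corr_gg = uniform_filter(guide * guide, win)
    corr_gs = uniform_filter(guide * src, win)
    var_g  = corr_gg - mean_g * mean_g
    cov_gs = corr_gs - mean_g * mean_s
    a = cov_gs / (var_g + eps)
    b = mean_s - a * mean_g
    return uniform_filter(a, win) * guide + uniform_filter(b, win)

def fuse_highpass(hp_sar, hp_opt, activity_sar, activity_opt, radius=4, eps=1e-3):
    """Select-then-refine fusion of one pair of high-frequency sub-bands.

    activity_sar / activity_opt are per-pixel activity maps supplied by the caller;
    they stand in for the thesis's divergence feature (assumption).
    """
    # Hard decision map: 1 where the SAR sub-band is judged more active.
    decision = (activity_sar > activity_opt).astype(np.float64)
    # Refine the hard decision with the guided filter, using each source
    # sub-band's magnitude as its own guide so the weights follow texture edges.
    w_sar = np.clip(guided_filter(np.abs(hp_sar), decision, radius, eps), 0, 1)
    w_opt = np.clip(guided_filter(np.abs(hp_opt), 1.0 - decision, radius, eps), 0, 1)
    norm = w_sar + w_opt + 1e-12
    return (w_sar * hp_sar + w_opt * hp_opt) / norm
```

The refinement step replaces the binary selection with spatially smooth, edge-aligned weights, which is the general mechanism by which guided filtering reduces noise-driven artifacts in the weight maps.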
algorithm is more robust to noise and spectral differences and effectively reduces global spectral distortion and spatial information loss in the fusion results.

(3) A fusion algorithm based on an improved pulse-coupled neural network and phase congruency information. To address the difficulty of setting the parameters of the pulse-coupled neural network, which makes it hard to evaluate pixel activity accurately and degrades fusion quality, the thesis proposes a fusion algorithm based on an improved pulse-coupled neural network and phase congruency information. The algorithm builds its basic framework on the generalized intensity-hue-saturation transform and the non-subsampled contourlet transform. In the low-frequency fusion, the algorithm constructs a "gain injection of phase congruency" fusion rule based on the maximum moment of phase congruency to further improve the spectral-information retention of the fusion algorithm. In the high-frequency fusion, the algorithm constructs an "improved pulse-coupled neural network based on phase congruency" fusion rule, improving spatial-information retention by optimizing the model parameters of the pulse-coupled neural network (a generic sketch of such a network appears at the end of this section). Experimental analysis shows that the proposed method overcomes the influence of noise and of significant nonlinear radiometric differences between the images on fusion quality, and achieves high-quality, stable fusion performance.

In summary, the thesis proposes three algorithms based on component substitution and multi-scale features for the fusion of SAR and optical remote sensing images. All three algorithms effectively fuse the precise and complete spatial information of SAR images with the rich spectral information of optical images to generate high-quality fused images. Moreover, the fusion performance of the three algorithms improves in turn; they effectively overcome the effects of noise and spectral differences, reduce the spectral and spatial distortion in the fused images, and deliver high-quality, stable fusion results.
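As referenced in (3), the following is a minimal, generic sketch of a pulse-coupled neural network used as an activity measure: each high-frequency coefficient drives one neuron, and coefficients whose neurons fire more often are treated as more salient. The parameter values, linking kernel, and function names are illustrative assumptions, and the thesis's actual contribution, adapting these parameters with phase-congruency information, is not reproduced here.

```python
import numpy as np
from scipy.ndimage import convolve

def pcnn_firing_map(stimulus, iterations=200, beta=0.2,
                    alpha_f=0.1, alpha_l=0.3, alpha_t=0.2,
                    v_f=0.5, v_l=0.2, v_t=20.0):
    """Run a basic pulse-coupled neural network and return each pixel's firing count.

    `stimulus` is a normalised activity map (one value per high-frequency coefficient);
    a larger firing count indicates a more salient coefficient.
    """
    s = stimulus.astype(np.float64)
    f = np.zeros_like(s)       # feeding input
    l = np.zeros_like(s)       # linking input
    y = np.zeros_like(s)       # pulse output of the previous iteration
    theta = np.ones_like(s)    # dynamic threshold
    fire_count = np.zeros_like(s)
    # 3x3 linking kernel: each neuron is coupled to its 8 neighbours.
    kernel = np.array([[0.5, 1.0, 0.5],
                       [1.0, 0.0, 1.0],
                       [0.5, 1.0, 0.5]])
    for _ in range(iterations):
        neigh = convolve(y, kernel, mode="constant")
        f = np.exp(-alpha_f) * f + v_f * neigh + s
        l = np.exp(-alpha_l) * l + v_l * neigh
        u = f * (1.0 + beta * l)              # internal activity
        y = (u > theta).astype(np.float64)    # neurons above the threshold fire
        theta = np.exp(-alpha_t) * theta + v_t * y
        fire_count += y
    return fire_count

# Coefficient selection sketch: keep whichever source fired more at each position.
# fired_sar = pcnn_firing_map(activity_sar); fired_opt = pcnn_firing_map(activity_opt)
# fused_hp = np.where(fired_sar >= fired_opt, hp_sar, hp_opt)
```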