Multi-exposure image fusion (MEF) is one of the effective ways to obtain high dynamic range (HDR) images. MEF methods aim to address the loss of image detail, local brightness and color distortion, halo artifacts in fusion, and poor adaptability across scenes. In addition, most existing MEF methods require a long sequence of source images with small exposure differences to obtain good fusion results; that is, the input multi-exposure image stack consists, in order of exposure, of under-exposed, medium-exposed, and over-exposed images. When the number of source images is small and the exposure difference between them is large, the quality of the fused image degrades significantly. Guided filtering, as a local linear filter, has shown great vitality in MEF tasks since its inception, and is widely used both to refine basic weight maps and to process image layers. In view of the fusion problems of the above MEF algorithms, the main work of this paper, based on guided filtering, is as follows:

First, to address the loss of detail in the multi-scale structural patch decomposition multi-exposure fusion method (MSPD-MEF), which achieves advanced fusion quality with the fastest running time, a multi-exposure fusion method with edge-preserving structural patch decomposition (MESPD-MEF) was proposed. However, the optimization of its edge-preservation factors is cumbersome and computationally expensive, and the introduced regularization constant also requires optimization during multi-scale decomposition. Therefore, a guided-filtering edge-preserving multi-scale SPD-MEF method (GFMSPD-MEF) is proposed. The edge-preserving guided filter is incorporated into MSPD-MEF and degenerated into an edge-preserving filter that replaces the mean filter. In addition, a more flexible adaptive weight is designed to further preserve details between bright and dark areas, and the proposed method also extends to a multi-scale framework suitable for both static and dynamic scenes. Compared with MESPD-MEF, it incurs only a slightly higher computation time, while the fused image has richer detail and better edge preservation.

Second, to address the halo artifacts that arise in dual-scale image fusion with traditional mean filtering, and the fact that image features of different scales require guided filtering with different filtering radii and blur coefficients, a dual-scale feature-enhancement MEF algorithm combined with image layering is proposed. Image decomposition and multi-scale pyramid fusion are used to suppress halo artifacts. Weights of various designs are combined at both the base layer and the detail layer to adapt to different scene images, enhance detail features, and maintain overall brightness and color.

Third, the quality of the fused image decreases significantly when there are few source images with large exposure differences, and the complementary weights proposed in the image fusion method MGFF are not conducive to recovering information under extreme exposure. To fully exploit the detail of extremely exposed images, guided filtering is used to decompose images at multiple scales, and a multi-scale two-exposure image fusion algorithm based on guided filtering is proposed. Each image is decomposed by guided filtering into a base layer and multiple detail layers, and a step-by-step exposure-weight and global gradient-weight assignment strategy is designed to mine the information contained in the base and detail layers and reconstruct the fused image. Under extreme exposure conditions, the fused images produced by the proposed method have bright colors, clear details, and a natural appearance, in line with human visual perception habits.
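The core building block shared by these methods can be sketched as follows: a guided filter applied with the image as its own guidance yields an edge-preserving base layer, the residual forms the detail layer, and per-pixel weights recombine the layers of two exposures. This is a minimal illustration of the general technique only, not the thesis's exact weights or multi-scale scheme; the function names, the Gaussian well-exposedness weight, and the parameter values (`r=8`, `eps=1e-3`) are illustrative assumptions.

```python
import numpy as np

def box_filter(img, r):
    """Mean filter with window radius r, computed via 2-D cumulative sums."""
    pad = np.pad(img, r, mode='edge')                 # replicate borders
    c = np.cumsum(np.cumsum(pad, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))                   # prepend zeros for inclusive sums
    n = 2 * r + 1
    s = c[n:, n:] - c[:-n, n:] - c[n:, :-n] + c[:-n, :-n]
    return s / (n * n)

def guided_filter(I, p, r, eps):
    """Guided filter: smooth p using guidance I while preserving I's edges."""
    mean_I, mean_p = box_filter(I, r), box_filter(p, r)
    var_I = box_filter(I * I, r) - mean_I * mean_I
    cov_Ip = box_filter(I * p, r) - mean_I * mean_p
    a = cov_Ip / (var_I + eps)                        # local linear coefficients
    b = mean_p - a * mean_I
    return box_filter(a, r) * I + box_filter(b, r)

def fuse_two_exposures(under, over, r=8, eps=1e-3):
    """Fuse two grayscale exposures in [0, 1] via base/detail decomposition.

    Illustrative scheme: self-guided filtering gives the base layers, a
    Gaussian well-exposedness weight blends them, and the stronger detail
    response is kept per pixel.
    """
    imgs = [under, over]
    bases = [guided_filter(x, x, r, eps) for x in imgs]   # edge-preserving base
    details = [x - b for x, b in zip(imgs, bases)]        # residual detail
    # favour mid-range intensities (assumed well-exposedness measure)
    w = [np.exp(-((x - 0.5) ** 2) / (2 * 0.2 ** 2)) for x in imgs]
    wsum = w[0] + w[1] + 1e-12
    base = (w[0] * bases[0] + w[1] * bases[1]) / wsum
    detail = np.where(np.abs(details[0]) > np.abs(details[1]),
                      details[0], details[1])             # keep dominant detail
    return np.clip(base + detail, 0.0, 1.0)
```

A self-guided filter with identical guidance and input is exactly the "degenerated" edge-preserving smoother mentioned above: in flat regions it behaves like a mean filter, but near strong edges the local linear model keeps the edge intact, which is why halo artifacts are weaker than with plain mean filtering.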