Remote sensing data are among the main data sources for geographic information applications and are widely used in object detection, land-cover classification, disaster monitoring, and other fields. However, owing to technological limitations and budget constraints, there is often a trade-off between the temporal resolution and the spatial resolution of satellite imagery, which makes it difficult to obtain remote sensing images with both high temporal and high spatial resolution. Moreover, optical remote sensing images are susceptible to cloud cover, which degrades image quality and reduces the amount of usable data, further limiting the application of remote sensing data. Research on multi-source remote sensing image spatiotemporal fusion methods that can generate images with both high temporal and high spatial resolution is therefore needed for scientific research and practical applications.

Among existing multi-source spatiotemporal fusion methods, those based on two prior image pairs cannot extract the deep information of the images because of the spatial resolution gap between the multi-source inputs, which leads to low fusion accuracy and high computational cost. Methods based on one prior image pair are limited by insufficient prior information, which results in poor visual quality of the fusion results and offers no guarantee of fusion precision. To address these issues, this paper uses generative adversarial networks to study multi-source remote sensing image spatiotemporal fusion methods based on two prior image pairs and on one prior image pair. The contributions of this paper are as follows:

(1) To address the problem that existing two-prior-image-pair spatiotemporal fusion methods do not fully exploit the deep information of the images, this paper proposes a two-branch image fusion network based on a generative adversarial network (IFGAN). IFGAN extracts content information from the multi-temporal low-resolution images and learns structural features from the high-resolution images; the high-temporal-high-spatial-resolution image is then reconstructed by feature fusion (an illustrative sketch of this two-branch design is given below). A two-stage framework is further proposed to improve the spatial resolution in stages. In this way, a two-prior-image-pair spatiotemporal fusion method based on a generative adversarial network (STFGAN) is developed.

(2) Considering that existing one-prior-image-pair spatiotemporal fusion methods are restricted by limited prior information, this paper proposes a change-compensation spatiotemporal fusion framework, on which a generative adversarial network-based spatiotemporal fusion model is built. Starting from the data characteristics, the model extracts the sensor difference, the temporal change, and the base information to deeply integrate Landsat and MODIS data. Moreover, this paper proposes a change loss that facilitates the extraction of temporal change information, so as to obtain high-temporal-high-spatial-resolution images with high fidelity. On this basis, a one-prior-image-pair spatiotemporal fusion method using a generative adversarial network (OPGAN) is designed.

To verify the performance of the proposed two-prior-image-pair and one-prior-image-pair spatiotemporal fusion methods, this paper conducts experiments on a spatiotemporal fusion dataset containing three study sites.
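Before turning to the results, the following minimal PyTorch sketch illustrates how a two-branch fusion generator of the kind described in contribution (1) could be organized. This is an assumption-laden illustration, not the exact IFGAN architecture: the band count, feature width, and layer depth are placeholders, and the coarse inputs are assumed to have been upsampled to the fine grid beforehand.

```python
import torch
import torch.nn as nn

class TwoBranchGenerator(nn.Module):
    """Illustrative two-branch fusion generator (hypothetical layer sizes).

    One branch extracts temporal-content features from the coarse
    (low-resolution) image pair; the other extracts structural features
    from the fine (high-resolution) prior image. The fused features are
    decoded into the predicted fine image at the target date.
    """
    def __init__(self, in_bands=6, feats=64):
        super().__init__()
        # Content branch: the two low-resolution acquisitions, stacked on the channel axis.
        self.content = nn.Sequential(
            nn.Conv2d(2 * in_bands, feats, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feats, feats, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Structure branch: the high-resolution prior image.
        self.structure = nn.Sequential(
            nn.Conv2d(in_bands, feats, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feats, feats, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Fusion/decoder: reconstruct the high-temporal-high-spatial-resolution image.
        self.decoder = nn.Sequential(
            nn.Conv2d(2 * feats, feats, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feats, in_bands, 3, padding=1),
        )

    def forward(self, coarse_t1, coarse_t2, fine_t1):
        # Coarse images are assumed already resampled to the fine grid.
        c = self.content(torch.cat([coarse_t1, coarse_t2], dim=1))
        s = self.structure(fine_t1)
        return self.decoder(torch.cat([c, s], dim=1))
```

Keeping content and structure in separate branches mirrors the division of labor described above: temporal dynamics come from the coarse image pair, while spatial detail comes from the fine prior image.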
The experimental results clearly demonstrate the effectiveness of the proposed STFGAN method from both the spectral and the spatial perspective, as well as the resolution-improvement capability of the two-stage framework. In addition, compared with existing one-prior-image-pair spatiotemporal fusion methods, OPGAN achieves a substantial improvement in both qualitative and quantitative evaluation, and its performance is even comparable to that of the two-prior-image-pair methods. The effectiveness of the proposed change loss is verified by ablation experiments.
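As a concrete reading of the change loss verified in the ablation study, the sketch below shows one plausible formulation; the exact definition used in this paper may differ. It assumes that the predicted fine-scale temporal change, once aggregated to the coarse grid, should match the change observed between the two coarse (e.g., MODIS) acquisitions; the scale factor of 16 is a placeholder for the Landsat-MODIS resolution ratio.

```python
import torch
import torch.nn.functional as F

def change_loss(pred_fine_t2, fine_t1, coarse_t1, coarse_t2, scale=16):
    """Illustrative change loss (hypothetical form, not the paper's exact definition).

    pred_fine_t2 and fine_t1 are on the fine (e.g., Landsat) grid;
    coarse_t1 and coarse_t2 are on their native coarse (e.g., MODIS) grid.
    """
    # Temporal change predicted at the fine scale.
    pred_change = pred_fine_t2 - fine_t1
    # Aggregate to the coarse grid; average pooling approximates the
    # coarse sensor's footprint over each fine-resolution block.
    pred_change_coarse = F.avg_pool2d(pred_change, kernel_size=scale)
    # Change actually observed between the two coarse acquisitions.
    observed_change = coarse_t2 - coarse_t1
    return F.l1_loss(pred_change_coarse, observed_change)
```

A term of this shape penalizes predictions whose temporal change disagrees with the coarse observations, which is one way a loss could steer the network toward extracting the temporal change information described in contribution (2).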