
Application of a Multi-Scale Semantic Segmentation Network to Fused Remote Sensing Terrain Classification

Posted on: 2022-11-15
Degree: Master
Type: Thesis
Country: China
Candidate: Z M Meng
Full Text: PDF
GTID: 2492306779496684
Subject: Automation Technology
Abstract/Summary:
The Greater Bay Area is one of the most economically vibrant and fastest-developing regions in China. Remote sensing terrain classification results can provide basic data support for urban planning and land-use monitoring. Synthetic aperture radar (SAR) systems are little affected by weather, illumination, or clouds, which makes them well suited to remote sensing data acquisition in this region. Semantic segmentation networks can classify ground objects in SAR images, but several problems remain: ground objects vary widely in scale, SAR remote sensing datasets contain few training samples, and SAR images lack spectral information. The main research contents of this thesis are as follows:

(1) To address the varying scales of ground objects in radar satellite observation scenes, a semantic segmentation model for terrain classification, ENet-CSPP, is proposed. Exploiting the fact that ordinary convolution preserves neighborhood information better than atrous convolution, a multi-scale feature fusion module, the convolutional spatial pyramid pooling (CSPP) module, is proposed. To address the scarcity of training samples in SAR remote sensing datasets, the dataset is expanded through data augmentation, and a method combining the multi-scale feature fusion module with a lightweight convolutional neural network is proposed to reduce the impact of limited training samples on classification accuracy. The encoder of the ENet-CSPP network consists of an improved ENet network and the convolutional spatial pyramid pooling module; the decoder fuses deep and shallow features and outputs terrain classification maps. The ENet-CSPP model uses a class-weighted cross-entropy loss function to improve the classification accuracy of small ground-object targets. Quantitative comparison experiments on the GDUT-Nansha V1.0 dataset show that ENet-CSPP outperforms other models on four performance indexes: pixel accuracy, mean pixel accuracy, mean intersection over union, and kappa coefficient, demonstrating that the multi-scale lightweight model effectively improves terrain classification accuracy.

(2) Most existing classical semantic segmentation networks are pre-trained on optical image datasets. To make full use of these pre-trained weights, SAR images must be fused with optical remote sensing images. Using image fusion methods based on the IHS transform and the PCA transform, TerraSAR-X satellite images are fused with Sentinel-2 RGB images, and the resulting images form the fused-image dataset GDUT-Nansha-FU V1.0. Quantitative comparison experiments with semantic segmentation models including ENet-CSPP on the fused dataset show improvements over the GDUT-Nansha V1.0 dataset on four evaluation indexes: pixel accuracy, mean pixel accuracy, mean intersection over union, and per-category intersection over union.

(3) Comparative experiments study the effects of the image fusion method and the SAR polarization mode on the model's ground-object classification accuracy. The results show that semantic segmentation models such as ENet-CSPP achieve the best classification accuracy on the fused dataset built from the IHS transform and HH-polarization data; pixel accuracy, mean pixel accuracy, mean intersection over union, and per-category intersection over union predicted on this dataset exceed those on the other fused-image datasets.
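The data-expansion step in (1) can be sketched as simple geometric augmentation. The thesis does not list its exact augmentations, so the function below (name and transform set are my own) illustrates one common scheme for small SAR datasets: the four 90-degree rotations of each image/label pair, each with and without a horizontal flip, giving eight variants per sample.

```python
import numpy as np

def augment_pairs(image, mask):
    """Yield flipped/rotated copies of an (image, mask) pair.

    A minimal data-expansion sketch for a small SAR segmentation dataset:
    rotating and flipping the image and its label mask together preserves
    the pixel-to-label correspondence.
    """
    for k in range(4):                      # 0/90/180/270 degree rotations
        rot_img = np.rot90(image, k)
        rot_msk = np.rot90(mask, k)
        yield rot_img, rot_msk
        yield np.fliplr(rot_img), np.fliplr(rot_msk)  # plus horizontal flip
```

Because the mask is transformed with the same operation as the image, no label re-annotation is needed, which is what makes geometric augmentation attractive when annotated SAR samples are scarce.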
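The class-weighted cross-entropy loss named in (1) can be written directly from its definition. The weighting scheme actually used by ENet-CSPP is not specified in the abstract, so the NumPy sketch below is a generic illustration (function name and normalization are assumptions): rare classes receive larger weights, so small ground-object targets contribute more to the loss.

```python
import numpy as np

def weighted_cross_entropy(logits, labels, class_weights):
    """Class-weighted cross-entropy over a batch of per-pixel logits.

    logits: (N, C) raw scores, labels: (N,) integer class indices,
    class_weights: (C,) per-class weights. Returns a scalar loss
    normalized by the total weight of the batch.
    """
    # numerically stable log-softmax
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # log-probability of the true class for each pixel
    picked = log_probs[np.arange(len(labels)), labels]
    w = class_weights[labels]
    return -(w * picked).sum() / w.sum()
```

In a framework such as PyTorch, the same effect is obtained by passing a per-class weight tensor to the built-in cross-entropy loss; the sketch above just makes the arithmetic explicit.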
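The four evaluation indexes used throughout (1)-(3) are all derived from a confusion matrix of predicted versus true labels. The helper below (a minimal sketch; the function name is my own) computes pixel accuracy, mean pixel accuracy, mean intersection over union, and the kappa coefficient.

```python
import numpy as np

def segmentation_metrics(pred, target, num_classes):
    """Compute pixel accuracy, mean pixel accuracy, mean IoU, and the
    kappa coefficient from predicted and true label maps."""
    cm = np.zeros((num_classes, num_classes), dtype=np.int64)
    for t, p in zip(target.ravel(), pred.ravel()):
        cm[t, p] += 1
    total = cm.sum()
    tp = np.diag(cm).astype(float)
    pixel_acc = tp.sum() / total
    # per-class accuracy: TP / (pixels of that true class)
    mean_pixel_acc = (tp / np.maximum(cm.sum(axis=1), 1)).mean()
    # per-class IoU: TP / (TP + FP + FN)
    iou = tp / np.maximum(cm.sum(axis=1) + cm.sum(axis=0) - tp, 1)
    # kappa: observed agreement corrected for chance agreement
    pe = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / float(total * total)
    kappa = (pixel_acc - pe) / (1.0 - pe)
    return pixel_acc, mean_pixel_acc, iou.mean(), kappa
```

Mean IoU and kappa are the more demanding indexes here: pixel accuracy can look high on imbalanced terrain classes even when small categories are missed, which is exactly why the thesis reports all four.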
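The IHS-based fusion in (2) substitutes the SAR band for the intensity component of the optical image. The sketch below uses the simple additive IHS model I = (R + G + B) / 3 with mean/variance matching of the SAR band; the thesis's exact transform and matching step may differ, so treat this as an assumed illustration.

```python
import numpy as np

def ihs_fuse(rgb, sar):
    """IHS-style fusion sketch: replace the intensity component of an
    optical RGB image with a statistics-matched SAR band.

    rgb: (H, W, 3) float array in [0, 1]; sar: (H, W) float array.
    Returns a fused (H, W, 3) array, also clipped to [0, 1].
    """
    intensity = rgb.mean(axis=2)            # additive IHS intensity
    # match the SAR band's mean/std to the intensity component so the
    # substitution does not shift the overall brightness
    sar_matched = (sar - sar.mean()) / (sar.std() + 1e-8)
    sar_matched = sar_matched * intensity.std() + intensity.mean()
    # substituting I is equivalent to adding the intensity difference
    # to every channel, which preserves hue and saturation
    fused = rgb + (sar_matched - intensity)[..., None]
    return np.clip(fused, 0.0, 1.0)
```

The fused result keeps the spectral (hue/saturation) information from Sentinel-2 while injecting the SAR backscatter structure, which is what lets optically pre-trained segmentation weights transfer to the GDUT-Nansha-FU V1.0 data.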
Keywords/Search Tags:Synthetic aperture radar, Terrain classification, Semantic segmentation, Remote sensing image fusion, Multi-scale