With the development of satellite sensor technology, higher spatial resolution and richer spectral bands have become the trend in remote sensing imagery. Correspondingly, remote sensing image processing techniques are becoming increasingly important. Compared with general images, the scale differences between semantic classes in remote sensing images are quite large, so extracting such large-scale context dependencies is difficult, which makes them an important breakthrough point in remote sensing image recognition. Existing classification schemes for land cover recognition can be divided into pixel-level, scene-level, and object-level classification according to the scale at which classification is performed. To explore the importance of multi-scale information, this paper proposes a solution for pixel-level and scene-level classification respectively. The specific work is as follows:

1. To solve the scene-level classification problem, a new machine learning image classification method based on the gcForest method is proposed. The main improvements of this model are a scanning module that effectively extracts multi-dimensional, multi-scale information and an efficient classification module. The proposed scanning structure can extract multi-scale information from multi-dimensional remote sensing images, and the proposed classifier's residual learning design allows it to extract features well. Experiments show that the scene-level classification method proposed in this paper achieves better classification performance on remote sensing scenes than most existing models.

2. To solve the pixel-level classification problem, we propose a novel multi-scale information extraction algorithm. A multi-scale prediction module is designed to exploit context information, which improves accuracy by predicting multi-scale results through a multi-label design. In addition, a multi-scale probability map spatial fusion module is used to fuse the correlation information between multi-scale maps. Qualitative and quantitative analyses indicate that our model can extract context information effectively.
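The scanning module described in point 1 follows the multi-grained scanning idea of gcForest: windows of several sizes slide over the multi-band image, and each window becomes one instance for a downstream forest. The following is a minimal sketch of the window-extraction step only; the function name, window sizes, and stride are illustrative assumptions, and the forest that would map each patch to a class-probability vector is omitted.

```python
import numpy as np

def multi_scale_scan(image, window_sizes=(4, 8), stride=2):
    """Slide windows of several sizes over an (H, W, C) image and
    collect flattened patches, one feature matrix per scale.

    Hypothetical sketch of gcForest-style multi-grained scanning:
    each scale yields a set of instances that a forest would then
    turn into class-probability features (forest step omitted)."""
    h, w, c = image.shape
    per_scale = []
    for win in window_sizes:
        patches = []
        for i in range(0, h - win + 1, stride):
            for j in range(0, w - win + 1, stride):
                # flatten each window across all spectral bands
                patches.append(image[i:i + win, j:j + win, :].ravel())
        per_scale.append(np.stack(patches))
    return per_scale
```

On a 16x16 image with 4 spectral bands, the 4-pixel window yields a 7x7 grid of 49 patches of dimension 64, and the 8-pixel window a 5x5 grid of 25 patches of dimension 256, so each scale contributes a differently shaped instance set.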
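The fusion step in point 2 combines per-pixel class-probability maps predicted at several scales. A minimal sketch of one plausible form of such spatial fusion is shown below, assuming coarser maps are brought to the finest resolution by nearest-neighbour upsampling and combined by a weighted average; the function name, weighting scheme, and upsampling choice are illustrative assumptions, not the thesis's exact module.

```python
import numpy as np

def fuse_probability_maps(maps, weights=None):
    """Fuse class-probability maps of shape (H_s, W_s, K) from several
    scales into one map at the finest resolution (maps[0]).

    Simplified stand-in for a multi-scale probability map spatial
    fusion module: coarse maps are upsampled by nearest-neighbour
    repetition, then averaged with per-scale weights."""
    target_h, target_w, k = maps[0].shape  # finest map listed first
    if weights is None:
        weights = [1.0 / len(maps)] * len(maps)
    fused = np.zeros((target_h, target_w, k))
    for m, w in zip(maps, weights):
        fy = target_h // m.shape[0]
        fx = target_w // m.shape[1]
        # nearest-neighbour upsample to the finest grid
        up = np.repeat(np.repeat(m, fy, axis=0), fx, axis=1)
        fused += w * up
    return fused

# per-pixel labels would then be fused.argmax(axis=-1)
```

With two classes, a 4x4 fine map predicting class 0 everywhere and a 2x2 coarse map predicting class 1 everywhere, weights (0.6, 0.4) give every fused pixel the distribution (0.6, 0.4), so the fine scale's prediction wins.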