
Construction of Low-Altitude Remote Sensing Recognition Features and Yield Estimation for Characteristic Crops in Karst Mountainous Areas under Complex Scene Classification

Posted on: 2024-11-12
Degree: Doctor
Type: Dissertation
Country: China
Candidate: R W Peng
Full Text: PDF
GTID: 1523307355468184
Subject: Cartography and Geographic Information System
Abstract/Summary:
In karst mountainous areas, cultivated land is fragmented and crops are often planted in complex, variable surface habitats, which makes crop identification and growth monitoring a persistent challenge. Because of the specificity and diversity of the surface environment, traditional ground-based monitoring methods often struggle to obtain crop growth information comprehensively and accurately. UAV-based low-altitude remote sensing offers a new solution for agricultural remote sensing: it is convenient to deploy and acquires high-resolution data, and has therefore been widely adopted in modern precision agriculture. However, low-altitude remote sensing data sources exhibit different data characteristics for different ground features. For characteristic crops on complex surfaces in karst mountainous areas in particular, identification and extraction from such data are easily disturbed by weeds, other ground objects, and other crops, and the effectiveness of a given extraction method can vary across complex scenes. Unlike the vast cultivated land of plain areas, karst mountainous areas therefore urgently require plot-level classification of planting plots with different surface compositions, so that the inherent laws and connections of low-altitude remote sensing data can be studied in depth and better applied to the construction and extraction of characteristic crop features under different methods and habitats.

Accordingly, this study takes pitaya (dragon fruit), a typical economic crop grown in complex surface habitats, as the research object, and divides the pitaya planting plots in the study area into six types of complex scenes according to their surface habitats: only similar color, only terrain change, only multi-type ground object coexistence, similar color-terrain change, terrain change-multi-type ground object coexistence, and bare soil-vegetation. RGB orthophotos and photogrammetric point cloud data were acquired by UAV low-altitude remote sensing, from which the spectral characteristics of pitaya in two-dimensional RGB imagery, the three-dimensional and planar characteristics of the pitaya point cloud, and a deep learning sample set of pitaya were constructed. Three methods, based respectively on spectral vegetation indices, a canopy height model (CHM), and a U-Net deep learning model, were used to identify and extract pitaya plants. The results were compared against field-collected validation data using three statistics (error rate, missing rate, and overall accuracy) to explore how multi-source data features, different complex surface scenes, and different methods affect the extraction of characteristic crops, and to assess their suitability. On the basis of the extracted plant counts, a pitaya yield estimation model was constructed from multi-period field measurements of sample plants to estimate yield and evaluate its efficiency. The main conclusions are as follows:

(1) RGB spectral recognition features were constructed from vegetation indices and successfully used to extract pitaya plants in different scenes. Among the five vegetation indices tested, IVDVI performed best (average overall accuracy of 80.69% to 95.08%) and NGBDI worst (73.62% to 89.68%). VDVI (79.57% to 94.54%) was second only to IVDVI and better than the other indices, confirming that IVDVI does improve on VDVI. RGBVI outperformed NGRDI in simple scenes but fell behind it in more complex scenes. Spectrum-based vegetation index extraction is strongly affected by other ground objects: results are good in plots with simple ground cover, but carry considerable uncertainty in more complex scenes.

(2) Based on the UAV point cloud data, the height difference characteristics between pitaya plants and interfering objects were constructed, and the CHM model was used to segment the point cloud and extract pitaya plants. Extraction is affected by the complexity of the growth plot, but is less disturbed by weeds or other crops of similar color. Among the six scene types, the average overall accuracy was 93.90% for bare soil-vegetation plots, 91.47% for similar color plots, 89.94% for terrain change plots, 88.97% for multi-type ground object coexistence plots, 87.72% for terrain change-multi-type ground object coexistence plots, and 80.65% for similar color-terrain change plots. This indicates that CHM-based segmentation is more affected by terrain changes and performs better in scenes with similar colors.

(3) Learning samples and supplementary samples of pitaya plants were constructed, and pitaya plants were successfully extracted with the U-Net deep learning model. Across the six scene types, overall accuracy ranked: similar color scene plot (86.80%) < similar color-terrain change scene plot (87.22%) < terrain change-multi-type ground object coexistence scene plot (88.32%) < multi-type ground object coexistence scene plot (90.83%) < terrain change scene plot (92.58%) < bare soil-vegetation scene plot (94.60%), indicating that U-Net-based segmentation attains high overall accuracy.

(4) Comparing the three extraction methods: the U-Net model and the spectrum-based vegetation indices are more suitable for complex plots without similar colors, while the CHM model is more suitable for complex plots with similar colors. CHM segmentation gives better results in all complex scene plots except the bare soil-vegetation plot, where it is worse than the spectrum-based vegetation indices. U-Net segmentation, like the spectrum-based vegetation indices, suffers strong interference from similar colors; apart from the bare soil-vegetation plot, where its results are worse than those of the CHM model, it outperforms both the spectrum-based vegetation indices and the CHM model in all other complex scene plots.

(5) Plant counts identified by low-altitude remote sensing and manual measurement data were used to construct yield estimation models for the Sunshine Orchard and the Xinzhongsheng Dragon Fruit Production Base, and their accuracy was tested. The accuracies of the manual labeling and low-altitude remote sensing identification methods were 89.13% and 85.67% (Sunshine Orchard) and 88.48% and 87.81% (Xinzhongsheng Dragon Fruit Production Base), respectively. The results show that the yield estimation model accurately predicts the plant counts and yield of dragon fruit, and that the low-altitude remote sensing identification method takes significantly less time than manual labeling, effectively improving the efficiency of yield estimation.
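The RGB vegetation indices named in conclusion (1) have standard published definitions, which can be sketched as below. This is a minimal illustration using those standard formulas; the IVDVI index is the dissertation's improved variant of VDVI, and since its exact form is not given in the abstract it is omitted here.

```python
import numpy as np

def vegetation_indices(r, g, b):
    """Compute standard RGB vegetation indices from band arrays.

    r, g, b: float arrays normalized to [0, 1]. The formulas below are
    the commonly published definitions; the dissertation's IVDVI is an
    improved VDVI whose exact form the abstract does not give.
    """
    eps = 1e-9  # guard against division by zero
    vdvi = (2 * g - r - b) / (2 * g + r + b + eps)
    ngrdi = (g - r) / (g + r + eps)
    ngbdi = (g - b) / (g + b + eps)
    rgbvi = (g ** 2 - r * b) / (g ** 2 + r * b + eps)
    return {"VDVI": vdvi, "NGRDI": ngrdi, "NGBDI": ngbdi, "RGBVI": rgbvi}

def vegetation_mask(index, threshold=0.0):
    """Binary vegetation mask: pixels whose index exceeds the threshold."""
    return index > threshold
```

In practice the threshold would be tuned per scene (e.g. by Otsu's method), which is one reason index-based extraction degrades in the more complex plots described above.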
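Conclusion (2) rests on a canopy height model, conventionally computed as the difference between a digital surface model (per-cell maximum elevation) and a digital terrain model (per-cell ground elevation), with plants kept where canopy height falls in a crop-specific band. A minimal sketch of that idea follows; the 0.1 m cell size and the 0.5-2.5 m height band are illustrative assumptions, not values from the dissertation.

```python
import numpy as np

def rasterize(points, cell, reducer):
    """Grid (x, y, z) points into a 2D raster, one value per cell.

    points: (N, 3) array; reducer (e.g. max for DSM, min for DTM)
    combines z-values that fall in the same cell. Empty cells are NaN.
    """
    xy = points[:, :2]
    origin = xy.min(axis=0)
    idx = np.floor((xy - origin) / cell).astype(int)
    grid = np.full(idx.max(axis=0) + 1, np.nan)
    for (i, j), z in zip(idx, points[:, 2]):
        grid[i, j] = z if np.isnan(grid[i, j]) else reducer(grid[i, j], z)
    return grid

def chm_mask(points, cell=0.1, h_min=0.5, h_max=2.5):
    """CHM = DSM - DTM; keep cells in the assumed plant-height band.

    The band [h_min, h_max] is a hypothetical pitaya height range used
    for illustration only.
    """
    dsm = rasterize(points, cell, max)  # canopy surface: per-cell max z
    dtm = rasterize(points, cell, min)  # terrain proxy: per-cell min z
    chm = dsm - dtm
    return (chm >= h_min) & (chm <= h_max)
```

A real pipeline would derive the DTM from classified ground points rather than the per-cell minimum, which is exactly where steep terrain introduces the errors reported for the terrain-change scenes.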
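The abstract does not state the exact form of the yield estimation model in conclusion (5). The simplest estimator consistent with its description, scaling the extracted plant count by the mean per-plant yield measured on sampled plants, can be sketched as follows; the function names and the accuracy definition (one minus relative error) are assumptions for illustration.

```python
def estimate_yield(plant_count, sample_yields_kg):
    """Estimate plot yield as plant count times mean per-plant yield.

    plant_count comes from remote sensing extraction (or manual
    labeling); sample_yields_kg are multi-period field measurements of
    sample plants. This count-based form is a sketch, not the
    dissertation's exact model.
    """
    mean_per_plant = sum(sample_yields_kg) / len(sample_yields_kg)
    return plant_count * mean_per_plant

def accuracy_pct(estimated, measured):
    """Accuracy as 1 minus relative error, expressed in percent."""
    return 100.0 * (1.0 - abs(estimated - measured) / measured)
```

Under such a model the two plant-count sources (manual labeling vs. remote sensing) differ only in the count they feed in, which matches the paired accuracy figures reported for the two production bases.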
Keywords/Search Tags: Low-altitude remote sensing, Characteristic crops, Identification feature construction, Rapid yield estimation