Citation: WANG Z P, TANG C Y, DENG T L, CUI H D, ZHANG Y Z, LIU L L. Integration of deep learning and change detection for flood-damaged greenhouse extraction using drone multispectral imagery[J]. Chinese Journal of Eco-Agriculture, 2026, 33(1): 1−11. DOI: 10.12357/cjea.20250247

Integration of deep learning and change detection for flood-damaged greenhouse extraction using drone multispectral imagery

  • In recent decades, flood disasters have occurred frequently worldwide, causing serious damage to agricultural production, to which greenhouses are particularly vulnerable. Rapid and accurate extraction of the spatial locations of damaged greenhouses is therefore of great significance for disaster loss assessment and post-disaster reconstruction. With the development of computer technology, deep learning has been widely applied to such tasks. To assess greenhouses damaged by flooding, we selected Yangying Village, Wuqing District, Tianjin, which was inundated from late July to August 2023 during the catastrophic “23·7” basin-wide flood in the Haihe River Basin. Using drone-borne multispectral remote sensing images, we efficiently detected damaged greenhouses with a deep learning approach. First, multidimensional spatial features were constructed to select the bands and indices best suited to greenhouse detection, based on spectral reflectance profiles and separability measures (see the first sketch below). Greenhouses were then detected in images of different resolutions using three deep learning networks. Finally, damaged greenhouses were identified accurately and efficiently from the status changes between greenhouses detected in images acquired in different periods. The results show that the blue band, green band, and NDVI were sensitive parameters with high spectral separability for greenhouse extraction; the blue band exhibited the highest separability, whereas the red, red-edge, and near-infrared bands showed lower separability. The three networks, Seg-UNet, Seg-UNet++, and DeepLab V3+, achieved overall greenhouse recognition accuracies above 97% and Kappa coefficients above 0.8. The model trained with the Seg-UNet network performed best, achieving the highest classification accuracy at an optimal epoch number of 40. Across image resolutions of 0.1−2.0 m, the model trained on 0.2 m resolution imagery achieved the highest accuracy, with an overall accuracy of 99.02% and a Kappa coefficient of 0.93. For greenhouse images from different periods, the base model was fine-tuned through transfer learning, improving its overall accuracy and Kappa coefficient. Using change detection to compare the greenhouse detection results from the non-flood and flood periods (see the second sketch below), damaged greenhouses were identified with an overall accuracy of 98.87% and a Kappa coefficient of 0.80. This study provides valuable insights into the application of drone-borne multispectral imagery for detecting damaged greenhouses, assessing disaster impacts, and supporting science-based post-disaster reconstruction.
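The abstract does not name the separability statistic used for band screening, so the following is only a minimal sketch assuming the commonly used Jeffries-Matusita (JM) distance under a per-band Gaussian assumption; the function names and reflectance samples are hypothetical, not from the paper. It shows how per-band separability between greenhouse and background pixels, and the NDVI used alongside the raw bands, could be computed:

    import numpy as np

    def jm_distance(x1, x2):
        """Jeffries-Matusita distance between two 1-D pixel samples,
        assuming each class is Gaussian within the band/index tested.
        Ranges from 0 (inseparable) to 2 (fully separable)."""
        m1, m2 = x1.mean(), x2.mean()
        v1, v2 = x1.var(), x2.var()
        # Bhattacharyya distance between two univariate Gaussians
        b = (m1 - m2) ** 2 / (4 * (v1 + v2)) \
            + 0.5 * np.log((v1 + v2) / (2 * np.sqrt(v1 * v2)))
        return 2 * (1 - np.exp(-b))

    def ndvi(nir, red):
        """Normalized Difference Vegetation Index from reflectance bands."""
        return (nir - red) / (nir + red + 1e-9)

    # Hypothetical reflectance samples drawn from labelled greenhouse and
    # background polygons (values are illustrative only).
    rng = np.random.default_rng(42)
    greenhouse_blue = rng.normal(0.30, 0.04, 500)
    background_blue = rng.normal(0.10, 0.05, 500)
    print(f"JM distance, blue band: {jm_distance(greenhouse_blue, background_blue):.3f}")

Repeating this per band and per index would rank candidate features, consistent with the paper's finding that the blue band separates greenhouses from background best.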
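Likewise, a minimal sketch of the final change-detection step, assuming the segmentation networks output co-registered binary greenhouse masks for the non-flood and flood acquisitions (function and variable names are hypothetical):

    import numpy as np

    def extract_damaged(mask_pre, mask_post):
        """Flag greenhouse pixels present before the flood but missing
        after it; both masks must be co-registered binary arrays where
        1 = greenhouse, 0 = background."""
        pre = mask_pre.astype(bool)
        post = mask_post.astype(bool)
        damaged = pre & ~post   # greenhouse lost between acquisitions
        intact = pre & post     # greenhouse still detected after the flood
        return damaged, intact

    # Illustrative 4x4 masks standing in for the networks' predictions.
    pre = np.array([[1, 1, 0, 0],
                    [1, 1, 0, 0],
                    [0, 0, 1, 1],
                    [0, 0, 1, 1]])
    post = np.array([[1, 1, 0, 0],
                     [1, 0, 0, 0],
                     [0, 0, 0, 0],
                     [0, 0, 0, 0]])
    damaged, intact = extract_damaged(pre, post)
    print(f"{damaged.sum()} damaged pixels, {intact.sum()} intact pixels")

Comparing per-period classification maps in this way shifts the problem from directly detecting damage to detecting greenhouses twice, which is what lets the transfer-learned per-period models drive the reported change-detection accuracy.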