TY - JOUR
T1 - A hybrid generative adversarial network for weakly-supervised cloud detection in multispectral images
AU - Li, Jun
AU - Wu, Zhaocong
AU - Sheng, Qinghong
AU - Wang, Bo
AU - Hu, Zhongwen
AU - Zheng, Shaobo
AU - Camps-Valls, Gustau
AU - Molinier, Matthieu
A2 - Chen, Jing M.
N1 - Funding Information:
We would like to thank Zhe Zhu and Shi Qiu (University of Connecticut) for providing the Fmask 4.0 validation dataset. The authors are grateful for the Sentinel-2 data services from Copernicus Open Access Hub. This work is supported in part by the Natural Science Foundation of Jiangsu Province (No. BK20220888), in part by the Fundamental Research Funds for the Central Universities (No. NT2022022), in part by the National Key Laboratory Foundation 2021-JCJQ-LB-006 (Grant No. 6142411442120) and in part by the National Key Laboratory Foundation (Contract No. 6142411203416). Gustau Camps-Valls would like to acknowledge the support from the European Research Council (ERC) under the ERC Synergy Grant USMILE (grant agreement 855187) and the DEEPCLOUD project funded by the Spanish Ministry of Science, Innovation and Universities (MCIU/AEI/FEDER, UE) PID2019-109026RB-I00. The work of Matthieu Molinier is part of the Finnish Flagship Programme FCAI: Finnish Center for Artificial Intelligence, supported by the Academy of Finland.
Publisher Copyright:
© 2022 The Authors
PY - 2022/10
Y1 - 2022/10
N2 - Cloud detection is a crucial step in the optical satellite image processing pipeline for Earth observation. Clouds in optical remote sensing images seriously affect the visibility of the background and greatly reduce the usability of images for land applications. Traditional methods based on thresholding, multi-temporal or multi-spectral information are often specific to a particular satellite sensor. Convolutional Neural Networks for cloud detection often require labeled cloud masks for training that are very time-consuming and expensive to obtain. To overcome these challenges, this paper presents a hybrid cloud detection method based on the synergistic combination of generative adversarial networks (GAN) and a physics-based cloud distortion model (CDM). The proposed weakly-supervised GAN-CDM method (available online https://github.com/Neooolee/GANCDM) only requires patch-level labels for training, and can produce cloud masks at pixel-level in both training and testing stages. GAN-CDM is trained on a new globally distributed Landsat 8 dataset (WHUL8-CDb, available online doi:https://doi.org/10.5281/zenodo.6420027) including image blocks and corresponding block-level labels. Experimental results show that the proposed GAN-CDM method trained on Landsat 8 image blocks achieves much higher cloud detection accuracy than baseline deep learning-based methods, not only in Landsat 8 images (L8 Biome dataset, 90.20% versus 72.09%) but also in Sentinel-2 images (“S2 Cloud Mask Catalogue” dataset, 92.54% versus 77.00%). This suggests that the proposed method provides accurate cloud detection in Landsat images, has good transferability to Sentinel-2 images, and can quickly be adapted for different optical satellite sensors.
AB - Cloud detection is a crucial step in the optical satellite image processing pipeline for Earth observation. Clouds in optical remote sensing images seriously affect the visibility of the background and greatly reduce the usability of images for land applications. Traditional methods based on thresholding, multi-temporal or multi-spectral information are often specific to a particular satellite sensor. Convolutional Neural Networks for cloud detection often require labeled cloud masks for training that are very time-consuming and expensive to obtain. To overcome these challenges, this paper presents a hybrid cloud detection method based on the synergistic combination of generative adversarial networks (GAN) and a physics-based cloud distortion model (CDM). The proposed weakly-supervised GAN-CDM method (available online https://github.com/Neooolee/GANCDM) only requires patch-level labels for training, and can produce cloud masks at pixel-level in both training and testing stages. GAN-CDM is trained on a new globally distributed Landsat 8 dataset (WHUL8-CDb, available online doi:https://doi.org/10.5281/zenodo.6420027) including image blocks and corresponding block-level labels. Experimental results show that the proposed GAN-CDM method trained on Landsat 8 image blocks achieves much higher cloud detection accuracy than baseline deep learning-based methods, not only in Landsat 8 images (L8 Biome dataset, 90.20% versus 72.09%) but also in Sentinel-2 images (“S2 Cloud Mask Catalogue” dataset, 92.54% versus 77.00%). This suggests that the proposed method provides accurate cloud detection in Landsat images, has good transferability to Sentinel-2 images, and can quickly be adapted for different optical satellite sensors.
KW - Cloud detection
KW - Cloud distortion model
KW - Deep learning
KW - Generative adversarial networks (GAN)
KW - Remote sensing
UR - http://www.scopus.com/inward/record.url?scp=85136277692&partnerID=8YFLogxK
U2 - 10.1016/j.rse.2022.113197
DO - 10.1016/j.rse.2022.113197
M3 - Article
C2 - 36193118
AN - SCOPUS:85136277692
SN - 0034-4257
VL - 280
JO - Remote Sensing of Environment
JF - Remote Sensing of Environment
M1 - 113197
ER -