TY - JOUR
T1 - SegCR
T2 - A Multimodal and Multitask Complementary Fusion Network for Remote Sensing Semantic Segmentation and Cloud Removal
AU - Wu, Shicheng
AU - Zhu, Jinbiao
AU - Gu, Yanlin
AU - Han, Wenqi
AU - Jiang, Wen
AU - Geng, Jie
N1 - Publisher Copyright:
© 1980-2012 IEEE.
PY - 2025
Y1 - 2025
N2 - The synthetic aperture radar (SAR) can provide complementary information to optical images due to its insensitivity to atmospheric conditions, making optical–SAR fusion semantic segmentation a popular research topic. However, existing optical–SAR fusion methods overlook cloud interference in real-world scenarios, leading to suboptimal performance in practical applications. To address the performance degradation of existing optical–SAR fusion semantic segmentation methods under cloud interference, we propose SegCR, a multimodal and multitask framework that leverages the complementary information of SAR images to reduce the negative influence of clouds in optical images. Specifically, we extract a cloud impact map from the frequency differences between optical and SAR features, which represents the extent of cloud interference for every pixel in the optical image. Next, we introduce an SAR-to-OPT translation subtask, leveraging SAR features to produce simulated optical features that supplement the missing information in the cloud-affected optical features. Then, based on the cloud impact map, we perform a targeted complementary fusion of optical and SAR features. Finally, we design a multitask learning framework that simultaneously performs semantic segmentation and cloud removal, enabling the high-level semantic understanding task and the low-level vision task to enhance each other. Extensive comparison experiments on two publicly available multitask remote sensing datasets reveal that our proposed SegCR outperforms existing state-of-the-art (SOTA) methods. In addition, ablation experiments confirm the effectiveness of the proposed module and the multitask learning framework.
KW - Cloud removal
KW - multimodal fusion
KW - multitask learning
KW - remote sensing
KW - semantic segmentation
UR - https://www.scopus.com/pages/publications/105014643908
U2 - 10.1109/TGRS.2025.3603066
DO - 10.1109/TGRS.2025.3603066
M3 - Article
AN - SCOPUS:105014643908
SN - 0196-2892
VL - 63
JO - IEEE Transactions on Geoscience and Remote Sensing
JF - IEEE Transactions on Geoscience and Remote Sensing
M1 - 5640014
ER -