TY - JOUR
T1 - MS-GAN
T2 - Learn to Memorize Scene for Unpaired SAR-to-Optical Image Translation
AU - Guo, Zhe
AU - Zhang, Zhibo
AU - Cai, Qinglin
AU - Liu, Jiayi
AU - Fan, Yangyu
AU - Mei, Shaohui
N1 - Publisher Copyright:
© 2008-2012 IEEE.
PY - 2024
Y1 - 2024
N2 - Synthetic aperture radar (SAR) and optical sensing are two important means of Earth observation. SAR-to-optical image translation (S2OIT) can integrate the advantages of both and assist SAR image interpretation under all-day and all-weather conditions. Existing S2OIT methods generally follow a paired training paradigm, which makes them difficult to apply in unpaired S2OIT scenarios. Moreover, the generator and discriminator in current S2OIT methods have insufficient scene memory for SAR images, resulting in regional landform deformation in the generated images. To address these issues, we propose MS-GAN, a novel generative adversarial network capable of memorizing scenes for unpaired S2OIT. A cycle learning framework based on the cycle generative adversarial network is designed to construct the translation mapping between unpaired SAR and optical images. A multiscale representation generator is constructed for the multiscale fusion and utilization of scene features in SAR images. The proposed multireceptive field discriminator enhances scene memory and generates higher quality optical images across different landforms. In addition, the designed subband shrinkage denoising module further suppresses the effect of speckle noise in SAR images on the quality of the generated results. Extensive experiments conducted on three challenging datasets, SEN1-2, WHU-SEN-City, and QXS-SAROPT, demonstrate that the proposed MS-GAN outperforms state-of-the-art methods on both subjective and objective evaluation metrics.
AB - Synthetic aperture radar (SAR) and optical sensing are two important means of Earth observation. SAR-to-optical image translation (S2OIT) can integrate the advantages of both and assist SAR image interpretation under all-day and all-weather conditions. Existing S2OIT methods generally follow a paired training paradigm, which makes them difficult to apply in unpaired S2OIT scenarios. Moreover, the generator and discriminator in current S2OIT methods have insufficient scene memory for SAR images, resulting in regional landform deformation in the generated images. To address these issues, we propose MS-GAN, a novel generative adversarial network capable of memorizing scenes for unpaired S2OIT. A cycle learning framework based on the cycle generative adversarial network is designed to construct the translation mapping between unpaired SAR and optical images. A multiscale representation generator is constructed for the multiscale fusion and utilization of scene features in SAR images. The proposed multireceptive field discriminator enhances scene memory and generates higher quality optical images across different landforms. In addition, the designed subband shrinkage denoising module further suppresses the effect of speckle noise in SAR images on the quality of the generated results. Extensive experiments conducted on three challenging datasets, SEN1-2, WHU-SEN-City, and QXS-SAROPT, demonstrate that the proposed MS-GAN outperforms state-of-the-art methods on both subjective and objective evaluation metrics.
KW - cycle generative adversarial network (CycleGAN)
KW - multi-scale fusion
KW - multireceptive field
KW - scene memory
KW - Synthetic aperture radar (SAR)-to-optical image translation (S2OIT)
UR - http://www.scopus.com/inward/record.url?scp=85196101303&partnerID=8YFLogxK
U2 - 10.1109/JSTARS.2024.3411691
DO - 10.1109/JSTARS.2024.3411691
M3 - Article
AN - SCOPUS:85196101303
SN - 1939-1404
VL - 17
SP - 11467
EP - 11484
JO - IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
JF - IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
ER -