Improved cGAN for SAR-to-Optical Image Translation

Pengcheng Hu, Yong Wang, Yifan Liu, Xinxin Guo, Yongkang Wang, Rongxin Cui

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Synthetic aperture radar (SAR) can be used for all-day, all-weather Earth observation, but its images suffer from speckle noise and geometric distortion, which hinder visual interpretation. To enhance the visual quality of SAR images, this paper proposes an improved cGAN (conditional generative adversarial network) method for translating SAR images into optical images. First, the generator uses a U-Net structure to combine global features with local features, which improves the detail of the generated images. Second, the discriminator uses a PatchGAN structure to extract and characterize local image features, allowing it to finely distinguish each part of the image. Finally, SSIM and PSNR loss terms are added to improve the fidelity of the generated images. In experiments on the SEN1-2 dataset, our method surpasses the basic cGAN and pix2pix. The translated images retain the key content of the SAR images while also having the style of optical images.
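The abstract mentions adding SSIM and PSNR loss terms to the cGAN objective. As a minimal illustration of how such full-reference quality metrics can be folded into a translation loss, the sketch below implements PSNR, a simplified single-window SSIM, and a hypothetical combined loss in plain NumPy. The weights `lam_ssim` and `lam_psnr`, the L1 base term, and the global (rather than sliding-window) SSIM are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def psnr(x, y, max_val=1.0):
    """Peak signal-to-noise ratio between two images with pixel range [0, max_val]."""
    mse = np.mean((x - y) ** 2)
    return 10.0 * np.log10(max_val ** 2 / mse)

def ssim_global(x, y, max_val=1.0):
    """Simplified SSIM computed over the whole image as a single window
    (the standard formulation averages over local sliding windows)."""
    c1 = (0.01 * max_val) ** 2  # stabilizing constants from the SSIM definition
    c2 = (0.03 * max_val) ** 2
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov = ((x - mu_x) * (y - mu_y)).mean()
    num = (2 * mu_x * mu_y + c1) * (2 * cov + c2)
    den = (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    return num / den

def combined_loss(fake, real, lam_ssim=1.0, lam_psnr=0.01):
    """Hypothetical reconstruction loss mixing an L1 term with SSIM and PSNR
    terms; lower is better, so SSIM enters as (1 - SSIM) and PSNR is subtracted."""
    l1 = np.mean(np.abs(fake - real))
    return l1 + lam_ssim * (1.0 - ssim_global(fake, real)) - lam_psnr * psnr(fake, real)
```

In a cGAN setup this reconstruction term would be added to the generator's adversarial loss; the PSNR term must be guarded (or clipped) when `fake == real`, since the MSE in the denominator is then zero.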

Original language: English
Title of host publication: Proceedings of the 43rd Chinese Control Conference, CCC 2024
Editors: Jing Na, Jian Sun
Publisher: IEEE Computer Society
Pages: 7675-7680
Number of pages: 6
ISBN (Electronic): 9789887581581
State: Published - 2024
Event: 43rd Chinese Control Conference, CCC 2024 - Kunming, China
Duration: 28 Jul 2024 - 31 Jul 2024

Publication series

Name: Chinese Control Conference, CCC
ISSN (Print): 1934-1768
ISSN (Electronic): 2161-2927

Conference

Conference: 43rd Chinese Control Conference, CCC 2024
Country/Territory: China
City: Kunming
Period: 28/07/24 - 31/07/24

Keywords

  • cGAN (Conditional Generative Adversarial Network)
  • deep learning
  • SAR-to-optical image translation

