SAR Image Despeckling with Residual-in-Residual Dense Generative Adversarial Network

Yunpeng Bai, Yayuan Xiao, Xuan Hou, Ying Li, Changjing Shang, Qiang Shen

Research output: Contribution to journal › Conference article › peer-review

2 Citations (Scopus)

Abstract

Deep convolutional neural networks have demonstrated remarkable capability in Synthetic Aperture Radar (SAR) image speckle removal tasks. Such approaches are nevertheless constrained in balancing speckle removal against the preservation of spatial information, particularly in the presence of strong speckle noise. In this paper, a novel residual-in-residual dense generative adversarial network is proposed to effectively suppress SAR image speckle while retaining rich spatial information. A despeckling sub-network composed of residual-in-residual dense blocks with an encoder-decoder structure is devised to learn an end-to-end mapping from noisy images to noise-free images, where the combination of the residual-in-residual structure and dense connections significantly enhances the feature representation capability. In addition, a discriminator sub-network with a fully convolutional structure is introduced, and an adversarial learning strategy is adopted to continuously refine the quality of the despeckled results. Systematic experiments on simulated and real SAR images demonstrate that the proposed approach offers superior performance in both quantitative and visual evaluation as compared to state-of-the-art methods.
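For orientation, the following is a minimal PyTorch sketch of the residual-in-residual dense block idea referred to in the abstract: several densely connected convolutional layers wrapped in nested residual (skip) connections. The layer counts, growth rate, channel width, and residual scaling factor here are illustrative assumptions, not parameters taken from the paper.

```python
import torch
import torch.nn as nn


class DenseBlock(nn.Module):
    """Dense block: each 3x3 conv receives the concatenation of all earlier features."""

    def __init__(self, channels: int = 64, growth: int = 32):
        super().__init__()
        # Four growth convolutions plus a fusion conv back to `channels` (assumed sizes).
        self.convs = nn.ModuleList(
            nn.Conv2d(channels + i * growth, growth, kernel_size=3, padding=1)
            for i in range(4)
        )
        self.fuse = nn.Conv2d(channels + 4 * growth, channels, kernel_size=3, padding=1)
        self.act = nn.LeakyReLU(0.2, inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = [x]
        for conv in self.convs:
            feats.append(self.act(conv(torch.cat(feats, dim=1))))
        # Inner residual connection; the 0.2 scaling is a common stabilisation choice.
        return x + 0.2 * self.fuse(torch.cat(feats, dim=1))


class RRDB(nn.Module):
    """Residual-in-residual dense block: stacked dense blocks inside an outer skip."""

    def __init__(self, channels: int = 64, num_dense_blocks: int = 3):
        super().__init__()
        self.blocks = nn.Sequential(*(DenseBlock(channels) for _ in range(num_dense_blocks)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Outer residual connection around the whole stack of dense blocks.
        return x + 0.2 * self.blocks(x)


if __name__ == "__main__":
    # Quick shape check on a dummy single-channel-like feature map.
    block = RRDB(channels=64)
    out = block(torch.randn(1, 64, 128, 128))
    print(out.shape)  # torch.Size([1, 64, 128, 128])
```

In the paper's design such blocks would sit inside an encoder-decoder despeckling generator and be trained adversarially against a fully convolutional discriminator; this sketch only illustrates the block structure itself.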
