Dehaze-TGGAN: Transformer-Guide Generative Adversarial Networks With Spatial-Spectrum Attention for Unpaired Remote Sensing Dehazing

Yitong Zheng, Jia Su, Shun Zhang, Mingliang Tao, Ling Wang

Research output: Contribution to journal › Article › peer-review

8 Citations (Scopus)

Abstract

Satellite imagery plays a critical role in target detection. However, the quality and usability of optical remote sensing images can be severely compromised by atmospheric conditions, particularly haze, which significantly reduces the recognition accuracy of detection algorithms for targets such as ships. Moreover, paired training data, i.e., remote sensing images of the same scene with and without haze, are difficult to obtain in real-world scenarios, which causes many existing dehazing methods to fail. To address these issues, this article proposes a Transformer-Guide generative adversarial network based on the CycleGAN framework (Dehaze-TGGAN), which incorporates an additional attention mechanism operating in the frequency domain. First, a spatial-spectrum attention (SSA) mechanism is proposed that applies a 2-D fast Fourier transform (2D FFT) to the spatial-domain image, enabling the model to capture the relationships within the three-channel frequency-domain information and to recover the spectral features of the hazy image through the spectrum encoder block. Then, a pre-training approach using semi-transparent masks (STM), which effectively simulates hazy conditions by adjusting the transparency of the masks, is presented as a key strategy to accelerate convergence. Finally, the applicability of the transformer architecture is extended by incorporating a total variation loss (TV loss). Results on simulated and measured optical remote sensing data show that the proposed algorithm greatly improves both recognition accuracy and efficiency.
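The abstract mentions two concrete ingredients that can be illustrated briefly: extracting frequency-domain features of a three-channel image with a 2-D FFT, and a total variation regularization term. The following is a minimal, hypothetical sketch of both ideas in PyTorch; the module and function names (SpectrumBlock, tv_loss) and the overall structure are illustrative assumptions, not the authors' implementation of Dehaze-TGGAN.

```python
# Hypothetical sketch of (1) a 2D-FFT-based spectrum block and (2) a TV loss.
# Not the paper's implementation; names and structure are assumptions.
import torch
import torch.nn as nn


class SpectrumBlock(nn.Module):
    """Toy spectrum encoder: 2-D FFT -> learnable mixing of amplitude/phase -> inverse FFT."""

    def __init__(self, channels: int = 3):
        super().__init__()
        # 1x1 convolutions that mix the per-channel amplitude and phase spectra.
        self.amp_mix = nn.Conv2d(channels, channels, kernel_size=1)
        self.pha_mix = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W) spatial-domain image.
        spec = torch.fft.fft2(x, norm="ortho")          # complex frequency-domain tensor
        amp = self.amp_mix(torch.abs(spec))             # re-weight amplitude spectrum
        pha = self.pha_mix(torch.angle(spec))           # re-weight phase spectrum
        spec_out = torch.polar(amp, pha)                # rebuild a complex spectrum
        out = torch.fft.ifft2(spec_out, norm="ortho").real
        return out + x                                  # residual connection back to the spatial domain


def tv_loss(img: torch.Tensor) -> torch.Tensor:
    """Anisotropic total variation: mean absolute difference between neighboring pixels."""
    dh = (img[..., 1:, :] - img[..., :-1, :]).abs().mean()
    dw = (img[..., :, 1:] - img[..., :, :-1]).abs().mean()
    return dh + dw


if __name__ == "__main__":
    x = torch.rand(2, 3, 64, 64)       # a batch of fake 3-channel patches
    y = SpectrumBlock()(x)
    print(y.shape, tv_loss(y).item())  # torch.Size([2, 3, 64, 64]) and a scalar TV value
```

In this sketch the TV term would simply be added to the generator objective as a smoothness regularizer; how it is weighted and combined with the adversarial and cycle-consistency losses in Dehaze-TGGAN is described in the paper itself.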

Original language: English
Article number: 5634320
Journal: IEEE Transactions on Geoscience and Remote Sensing
Volume: 62
DOI
Publication status: Published - 2024
