TY - JOUR
T1 - Dehaze-AGGAN
T2 - Unpaired Remote Sensing Image Dehazing Using Enhanced Attention-Guide Generative Adversarial Networks
AU - Zheng, Yitong
AU - Su, Jia
AU - Zhang, Shun
AU - Tao, Mingliang
AU - Wang, Ling
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Remote sensing image dehazing is of great scientific interest and application value in both military and civil fields. In this article, we propose an enhanced attention-guided generative adversarial network (GAN), Dehaze-AGGAN, to solve the remote sensing image dehazing problem without requiring paired training data. Since haze strongly degrades remote sensing object detection, dehazing remote sensing images has become significantly important. Typical image dehazing methods require a hazy input image and its ground truth in a paired manner, whereas paired training data are usually not available in the field of remote sensing. To solve this problem, we propose the Dehaze-AGGAN network and train it by feeding unpaired clean and hazy images into the model. We present a novel total variation loss combined with the cycle consistency loss to eliminate wave noise and improve target edge quality on the test dataset. Moreover, we present a new dehazing dataset, the remote sensing dehazing dataset (RSD), which contains 7000 simulated and real hazy images, including 3500 warship images and 3500 civilian ship images, and evaluate our method on it. Extensive experiments on RSD demonstrate that the proposed Dehaze-AGGAN is effective and has strong robustness and adaptability in different settings.
AB - Remote sensing image dehazing is of great scientific interest and application value in both military and civil fields. In this article, we propose an enhanced attention-guided generative adversarial network (GAN), Dehaze-AGGAN, to solve the remote sensing image dehazing problem without requiring paired training data. Since haze strongly degrades remote sensing object detection, dehazing remote sensing images has become significantly important. Typical image dehazing methods require a hazy input image and its ground truth in a paired manner, whereas paired training data are usually not available in the field of remote sensing. To solve this problem, we propose the Dehaze-AGGAN network and train it by feeding unpaired clean and hazy images into the model. We present a novel total variation loss combined with the cycle consistency loss to eliminate wave noise and improve target edge quality on the test dataset. Moreover, we present a new dehazing dataset, the remote sensing dehazing dataset (RSD), which contains 7000 simulated and real hazy images, including 3500 warship images and 3500 civilian ship images, and evaluate our method on it. Extensive experiments on RSD demonstrate that the proposed Dehaze-AGGAN is effective and has strong robustness and adaptability in different settings.
KW - Attention guided
KW - dehaze
KW - generative adversarial networks (GANs)
KW - total variation loss
UR - http://www.scopus.com/inward/record.url?scp=85137916262&partnerID=8YFLogxK
U2 - 10.1109/TGRS.2022.3204890
DO - 10.1109/TGRS.2022.3204890
M3 - Article
AN - SCOPUS:85137916262
SN - 0196-2892
VL - 60
JO - IEEE Transactions on Geoscience and Remote Sensing
JF - IEEE Transactions on Geoscience and Remote Sensing
M1 - 5630413
ER -