TY - JOUR
T1 - Difference-Guided Aggregation Network With Multiimage Pixel Contrast for Change Detection
AU - Zhang, Mingwei
AU - Li, Qiang
AU - Miao, Yanling
AU - Yuan, Yuan
AU - Wang, Qi
N1 - Publisher Copyright:
© 1980-2012 IEEE.
PY - 2023
Y1 - 2023
N2 - Change detection is a critical task in remote sensing for monitoring the state of the Earth's surface, and the field has recently been dominated by deep learning-based methods. Many models that capture the temporal-spatial correlation in bitemporal images through nonlocal interaction between bitemporal features achieve impressive performance. However, in complex scenes containing multiple change types or weakly discriminative objects, they struggle to fuse information discriminatively because of the weak semantic discrimination of the bitemporal representations. To address this problem, a difference-guided aggregation network (DGANet) is proposed, into which two key modules are injected: a difference-guided aggregation module (DGAM) and a weighted metric module (WMM). In DGAM, the bitemporal features are aggregated under the guidance of their differences, which focuses on their change relevance and relaxes their semantic distinction; the fused features are therefore change-relevant and discriminative. WMM performs adaptive distance computation between the bitemporal features through dynamic feature attention in different dimensions, which helps suppress pseudo-changes. In addition, a change magnitude contrastive loss (CMCL) is introduced to exploit the dependency of pixels across different bitemporal images, further enhancing the representation quality of the model; it is also extended in this work. The effectiveness of the three improvements is demonstrated by extensive ablation studies, and results on three widely used datasets show that our method achieves satisfactory performance.
AB - Change detection is a critical task in remote sensing for monitoring the state of the Earth's surface, and the field has recently been dominated by deep learning-based methods. Many models that capture the temporal-spatial correlation in bitemporal images through nonlocal interaction between bitemporal features achieve impressive performance. However, in complex scenes containing multiple change types or weakly discriminative objects, they struggle to fuse information discriminatively because of the weak semantic discrimination of the bitemporal representations. To address this problem, a difference-guided aggregation network (DGANet) is proposed, into which two key modules are injected: a difference-guided aggregation module (DGAM) and a weighted metric module (WMM). In DGAM, the bitemporal features are aggregated under the guidance of their differences, which focuses on their change relevance and relaxes their semantic distinction; the fused features are therefore change-relevant and discriminative. WMM performs adaptive distance computation between the bitemporal features through dynamic feature attention in different dimensions, which helps suppress pseudo-changes. In addition, a change magnitude contrastive loss (CMCL) is introduced to exploit the dependency of pixels across different bitemporal images, further enhancing the representation quality of the model; it is also extended in this work. The effectiveness of the three improvements is demonstrated by extensive ablation studies, and results on three widely used datasets show that our method achieves satisfactory performance.
KW - Adaptive metric
KW - change detection
KW - contrastive loss
KW - difference-guided aggregation
UR - http://www.scopus.com/inward/record.url?scp=85161044792&partnerID=8YFLogxK
U2 - 10.1109/TGRS.2023.3278739
DO - 10.1109/TGRS.2023.3278739
M3 - Article
AN - SCOPUS:85161044792
SN - 0196-2892
VL - 61
JO - IEEE Transactions on Geoscience and Remote Sensing
JF - IEEE Transactions on Geoscience and Remote Sensing
M1 - 5611114
ER -