Style Transfer-Based Unsupervised Change Detection from Heterogeneous Images

Zuowei Zhang, Chuanqi Liu, Fan Hao, Zhunga Liu

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Heterogeneous images are captured in different wavelength bands and provide rich, complementary information for change detection (CD), and domain transformation has emerged as a popular and effective way to exploit them. However, existing domain transformation-based CD methods rely heavily on the quality of the reconstructed features, which makes them inadequate for complex practical scenarios. In this paper, we propose a Style Transfer-based CD (STCD) method based on unsupervised learning. STCD improves the quality of the reconstructed images and remains robust to reconstruction errors by simultaneously employing a cautious labeling strategy and classification. Specifically, we first convert the two given heterogeneous images into a shared domain by constructing a convolutional autoencoder based on adaptive instance normalization, which improves the quality of the reconstructed features and mitigates the data heterogeneity. Furthermore, we extract significant pixel pairs with fuzzy local information c-means to reduce the over-reliance on reconstructed features. We then propose a Dynamic Superpixel-based Label Assignment (DSLA) rule to increase the number of reliable pseudo-labels used to train a binary classifier. As a result, STCD obtains good CD results even when the reconstruction quality is poor. Experimental results on four heterogeneous datasets demonstrate the effectiveness of STCD over other related CD methods.
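To make the shared-domain step concrete, the following is a minimal sketch of adaptive instance normalization (AdaIN) inside a convolutional autoencoder, as the abstract describes. This is not the authors' implementation; the encoder/decoder layers, channel sizes, and the `ToyAdaINAutoencoder` name are placeholder assumptions used only to illustrate how per-channel feature statistics of one modality can be transferred to the other before decoding.

```python
# Illustrative AdaIN-based style transfer for mapping two heterogeneous images
# into a shared domain (hypothetical sketch, not the paper's code).
import torch
import torch.nn as nn


def adain(content_feat: torch.Tensor, style_feat: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """Re-normalize content features to match the per-channel mean/std of the style features."""
    c_mean = content_feat.mean(dim=(2, 3), keepdim=True)
    c_std = content_feat.std(dim=(2, 3), keepdim=True) + eps
    s_mean = style_feat.mean(dim=(2, 3), keepdim=True)
    s_std = style_feat.std(dim=(2, 3), keepdim=True) + eps
    return s_std * (content_feat - c_mean) / c_std + s_mean


class ToyAdaINAutoencoder(nn.Module):
    """Toy convolutional autoencoder: encode both images, swap style statistics
    with AdaIN, then decode the re-stylized content features."""

    def __init__(self, in_ch: int = 3, feat_ch: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_ch, in_ch, 3, padding=1),
        )

    def forward(self, content_img: torch.Tensor, style_img: torch.Tensor) -> torch.Tensor:
        f_content = self.encoder(content_img)
        f_style = self.encoder(style_img)
        return self.decoder(adain(f_content, f_style))


# Usage: render one modality (e.g., an optical image) in the style of the other
# so the pair can be compared in a shared domain before change detection.
model = ToyAdaINAutoencoder()
img_a = torch.rand(1, 3, 64, 64)  # placeholder pre-event image
img_b = torch.rand(1, 3, 64, 64)  # placeholder post-event image, different modality
shared_domain_img = model(img_a, img_b)
```

The design choice AdaIN makes is that "style" is carried entirely by per-channel feature means and standard deviations, so transferring those statistics requires no learned style parameters and keeps the autoencoder lightweight.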
