TY - GEN
T1 - Self-Supervised Localized Topology Consistency for Noise-Robust Hyperspectral Image Classification
AU - Wang, Jie
AU - Tang, Liaoyuan
AU - He, Guanxiong
AU - Cao, Zhe
AU - Wang, Zheng
AU - Wang, Rong
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Label noise in hyperspectral image classification (HIC) can severely degrade model performance by leading to incorrect predictions and overfitting, especially as erroneous labels propagate and compound throughout the training process. To address this, we propose a robust learning framework called Self-Supervised Localized Topology Consistency (SSLTC), which enforces local topology consistency to enhance model resilience against noisy labels. SSLTC captures local topology via a graph-based representation, where nodes represent samples and edges encode pairwise similarities. Predictions are propagated from topologically similar nodes to central nodes, constrained by Kullback-Leibler (KL) divergence to encourage consistent predictions and reduce sensitivity to noisy labels. Additionally, a self-supervised contrastive learning strategy is used to refine spectral-spatial representations in an unsupervised manner, further improving robustness. Extensive experiments on hyperspectral benchmark datasets with varying noise levels demonstrate the superiority of SSLTC in mitigating the adverse effects of label noise compared to state-of-the-art approaches in HIC tasks.
AB - Label noise in hyperspectral image classification (HIC) can severely degrade model performance by leading to incorrect predictions and overfitting, especially as erroneous labels propagate and compound throughout the training process. To address this, we propose a robust learning framework called Self-Supervised Localized Topology Consistency (SSLTC), which enforces local topology consistency to enhance model resilience against noisy labels. SSLTC captures local topology via a graph-based representation, where nodes represent samples and edges encode pairwise similarities. Predictions are propagated from topologically similar nodes to central nodes, constrained by Kullback-Leibler (KL) divergence to encourage consistent predictions and reduce sensitivity to noisy labels. Additionally, a self-supervised contrastive learning strategy is used to refine spectral-spatial representations in an unsupervised manner, further improving robustness. Extensive experiments on hyperspectral benchmark datasets with varying noise levels demonstrate the superiority of SSLTC in mitigating the adverse effects of label noise compared to state-of-the-art approaches in HIC tasks.
KW - contrastive learning
KW - Hyperspectral image classification (HIC)
KW - localized topology consistency
KW - noisy labels
UR - http://www.scopus.com/inward/record.url?scp=105003867304&partnerID=8YFLogxK
U2 - 10.1109/ICASSP49660.2025.10887777
DO - 10.1109/ICASSP49660.2025.10887777
M3 - Conference contribution
AN - SCOPUS:105003867304
T3 - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
BT - 2025 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2025 - Proceedings
A2 - Rao, Bhaskar D
A2 - Trancoso, Isabel
A2 - Sharma, Gaurav
A2 - Mehta, Neelesh B.
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2025 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2025
Y2 - 6 April 2025 through 11 April 2025
ER -