TY - JOUR
T1 - Adaptive sparse contrastive learning for unsupervised object re-identification
AU - Zheng, Dingyuan
AU - Liu, Yang
AU - Zhou, Deyun
AU - Xiao, Jimin
AU - Zhang, Bingfeng
AU - Chen, Lin
N1 - Publisher Copyright:
© 2025 Elsevier Ltd
PY - 2026/4
Y1 - 2026/4
N2 - Clustering-based contrastive learning offers a promising solution for unsupervised object re-identification but is hindered by challenges such as ambiguous and redundant contrastive pairs generated from noisy pseudo labels and dense contrastive mechanisms. These issues mislead training and distract the learning focus, impeding the extraction of discriminative features. The efficiency-reliability trade-off commonly encountered in unsupervised learning also compromises representation learning. To address these drawbacks, we introduce a novel paradigm, dubbed adaptive sparse contrastive learning (ASCL), that sparsely constructs informative contrastive pairs for each pseudo-class in a mini-batch, rather than establishing dense pairs for every instance. To mitigate the efficiency-reliability trade-off, we propose two complementary sparse contrastive learning objectives: the composite sparse contrastive loss (CSCL) and the weighted sparse contrastive loss (WSCL). These objectives adaptively modulate the pulling strength within classes based on the changing intra-cluster discrepancies, balancing aggressive exploration and conservatism, thus facilitating the exploration of fine-grained and informative identity cues. Our proposed ASCL enhances representation learning without additional overhead. Extensive experiments on three large-scale object re-identification datasets demonstrate the superiority of the proposed ASCL. In particular, ASCL achieves new state-of-the-art results on the challenging MSMT17 dataset, with 46.6% mAP and 76.4% top-1 accuracy. Code will be available at: https://github.com/Dingyuan-Zheng/ASCL.
AB - Clustering-based contrastive learning offers a promising solution for unsupervised object re-identification but is hindered by challenges such as ambiguous and redundant contrastive pairs generated from noisy pseudo labels and dense contrastive mechanisms. These issues mislead training and distract the learning focus, impeding the extraction of discriminative features. The efficiency-reliability trade-off commonly encountered in unsupervised learning also compromises representation learning. To address these drawbacks, we introduce a novel paradigm, dubbed adaptive sparse contrastive learning (ASCL), that sparsely constructs informative contrastive pairs for each pseudo-class in a mini-batch, rather than establishing dense pairs for every instance. To mitigate the efficiency-reliability trade-off, we propose two complementary sparse contrastive learning objectives: the composite sparse contrastive loss (CSCL) and the weighted sparse contrastive loss (WSCL). These objectives adaptively modulate the pulling strength within classes based on the changing intra-cluster discrepancies, balancing aggressive exploration and conservatism, thus facilitating the exploration of fine-grained and informative identity cues. Our proposed ASCL enhances representation learning without additional overhead. Extensive experiments on three large-scale object re-identification datasets demonstrate the superiority of the proposed ASCL. In particular, ASCL achieves new state-of-the-art results on the challenging MSMT17 dataset, with 46.6% mAP and 76.4% top-1 accuracy. Code will be available at: https://github.com/Dingyuan-Zheng/ASCL.
KW - Clustering methods
KW - Object re-identification
KW - Sparse contrastive learning
UR - https://www.scopus.com/pages/publications/105019184589
U2 - 10.1016/j.patcog.2025.112604
DO - 10.1016/j.patcog.2025.112604
M3 - Article
AN - SCOPUS:105019184589
SN - 0031-3203
VL - 172
JO - Pattern Recognition
JF - Pattern Recognition
M1 - 112604
ER -