TY - JOUR
T1 - Mutual-Taught Deep Clustering
AU - Hu, Zhanxuan
AU - Wang, Yichen
AU - Ning, Hailong
AU - Wu, Danyang
AU - Nie, Feiping
N1 - Publisher Copyright:
© 2023 Elsevier B.V.
PY - 2023/12/20
Y1 - 2023/12/20
N2 - Deep clustering seeks to group data into distinct clusters using deep learning techniques. Existing deep clustering approaches can be broadly categorized into two groups: offline clustering based on unsupervised representation learning and online clustering based on unsupervised classification. While both groups have demonstrated impressive performance, no study has explored integrating their respective strengths. To this end, we propose Mutual-Taught Deep Clustering (MTDC), which unifies unsupervised representation learning and unsupervised classification in a single framework and realizes mutual promotion through a novel mutual-taught mechanism. Specifically, MTDC alternates between predicting pseudolabels in label space and estimating semantic similarity in feature space during training. The pseudolabels provide weakly supervised information that enhances unsupervised representation learning, while the semantic similarities serve as structural priors that regularize unsupervised classification. Consequently, unsupervised classification and unsupervised representation learning mutually benefit from each other. MTDC is decoupled from prevailing deep clustering methods; for clarity, we build upon a straightforward baseline in this paper. Despite the simplicity of this baseline, MTDC is highly effective and consistently improves the baseline results by substantial margins. For example, MTDC achieves gains of 2.5%∼7.9% (NMI), 3.0%∼13.9% (ACC), and 3.1%∼16.7% (ARI) over the baseline on six widely used image datasets. Source code is available at: https://github.com/yichenwang231/MTDC.
KW - Clustering
KW - Representation learning
KW - Unsupervised learning
UR - http://www.scopus.com/inward/record.url?scp=85174721430&partnerID=8YFLogxK
U2 - 10.1016/j.knosys.2023.111100
DO - 10.1016/j.knosys.2023.111100
M3 - Article
AN - SCOPUS:85174721430
SN - 0950-7051
VL - 282
JO - Knowledge-Based Systems
JF - Knowledge-Based Systems
M1 - 111100
ER -