TY - GEN
T1 - Multi-View Subspace Clustering With Consensus Graph Contrastive Learning
AU - Zhang, Jie
AU - Sun, Yuan
AU - Guo, Yu
AU - Wang, Zheng
AU - Nie, Feiping
AU - Wang, Fei
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - A significant challenge in multi-view clustering lies in the comprehensive extraction of consistency and complementary information from heterogeneous multi-view data. Numerous methods employ contrastive learning techniques to explore the information between views. However, the basic contrastive learning strategy does not consider cluster information when constructing sample pairs, potentially leading to the emergence of false negative pairs (FNPs). To tackle this concern, we propose a Multi-view Subspace Clustering with Consensus Graph Contrastive Learning (CGCL) model. Specifically, a self-representation layer is designed to acquire a consensus graph that elucidates the overall data distribution. Furthermore, a contrastive learning layer utilizes the cluster information embedded in the consensus graph to yield reliable sample pairs, resulting in a reduction of the detrimental FNPs and the extraction of complementary information from the various views. Extensive experiments on public datasets demonstrate the effectiveness of CGCL.
AB - A significant challenge in multi-view clustering lies in the comprehensive extraction of consistency and complementary information from heterogeneous multi-view data. Numerous methods employ contrastive learning techniques to explore the information between views. However, the basic contrastive learning strategy does not consider cluster information when constructing sample pairs, potentially leading to the emergence of false negative pairs (FNPs). To tackle this concern, we propose a Multi-view Subspace Clustering with Consensus Graph Contrastive Learning (CGCL) model. Specifically, a self-representation layer is designed to acquire a consensus graph that elucidates the overall data distribution. Furthermore, a contrastive learning layer utilizes the cluster information embedded in the consensus graph to yield reliable sample pairs, resulting in a reduction of the detrimental FNPs and the extraction of complementary information from the various views. Extensive experiments on public datasets demonstrate the effectiveness of CGCL.
KW - contrastive learning
KW - multi-view clustering
KW - self-representation learning
KW - unsupervised learning
UR - http://www.scopus.com/inward/record.url?scp=85211578001&partnerID=8YFLogxK
U2 - 10.1109/ICASSP48485.2024.10446405
DO - 10.1109/ICASSP48485.2024.10446405
M3 - Conference contribution
AN - SCOPUS:85211578001
T3 - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
SP - 6340
EP - 6344
BT - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2024 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024
Y2 - 14 April 2024 through 19 April 2024
ER -