TY - JOUR
T1 - Bidirectional Probabilistic Subspaces Approximation for Multiview Clustering
AU - Wu, Danyang
AU - Dong, Xia
AU - Cao, Jianfu
AU - Wang, Rong
AU - Nie, Feiping
AU - Li, Xuelong
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2024
Y1 - 2024
N2 - Existing multiview clustering models learn a consistent low-dimensional embedding either from multiple feature matrices or from multiple similarity matrices, which ignores the interaction between the two procedures and limits the improvement of clustering performance on multiview data. To address this issue, a bidirectional probabilistic subspaces approximation (BPSA) model is developed in this article to learn a consistently orthogonal embedding from multiple feature matrices and multiple similarity matrices simultaneously via disturbed probabilistic subspace modeling and approximation. A skillful bidirectional fusion strategy is designed to guarantee the parameter-free property of the BPSA model. Two adaptively weighted learning mechanisms are introduced to account for the inconsistencies among multiple views and the inconsistencies between the bidirectional learning processes. To solve the optimization problem involved in the BPSA model, an iterative solver is derived, and a rigorous convergence guarantee is provided. Extensive experimental results on both toy and real-world datasets demonstrate that our BPSA model achieves state-of-the-art performance even though it is parameter-free.
AB - Existing multiview clustering models learn a consistent low-dimensional embedding either from multiple feature matrices or from multiple similarity matrices, which ignores the interaction between the two procedures and limits the improvement of clustering performance on multiview data. To address this issue, a bidirectional probabilistic subspaces approximation (BPSA) model is developed in this article to learn a consistently orthogonal embedding from multiple feature matrices and multiple similarity matrices simultaneously via disturbed probabilistic subspace modeling and approximation. A skillful bidirectional fusion strategy is designed to guarantee the parameter-free property of the BPSA model. Two adaptively weighted learning mechanisms are introduced to account for the inconsistencies among multiple views and the inconsistencies between the bidirectional learning processes. To solve the optimization problem involved in the BPSA model, an iterative solver is derived, and a rigorous convergence guarantee is provided. Extensive experimental results on both toy and real-world datasets demonstrate that our BPSA model achieves state-of-the-art performance even though it is parameter-free.
KW - Adaptively weighted learning
KW - bidirectional fusion
KW - disturbed probabilistic subspace
KW - multiview clustering
KW - parameter-free model
UR - http://www.scopus.com/inward/record.url?scp=85142793529&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2022.3217032
DO - 10.1109/TNNLS.2022.3217032
M3 - Article
C2 - 36383582
AN - SCOPUS:85142793529
SN - 2162-237X
VL - 35
SP - 8939
EP - 8953
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 7
ER -