Bidirectional Probabilistic Subspaces Approximation for Multiview Clustering

Danyang Wu, Xia Dong, Jianfu Cao, Rong Wang, Feiping Nie, Xuelong Li

Research output: Contribution to journal › Article › peer-review

7 Scopus citations

Abstract

Existing multiview clustering models learn a consistent low-dimensional embedding either from multiple feature matrices or from multiple similarity matrices, which ignores the interaction between the two procedures and limits the achievable clustering performance on multiview data. To address this issue, a bidirectional probabilistic subspaces approximation (BPSA) model is developed in this article to learn a consistent orthogonal embedding from multiple feature matrices and multiple similarity matrices simultaneously via disturbed probabilistic subspace modeling and approximation. A skillful bidirectional fusion strategy is designed to guarantee the parameter-free property of the BPSA model. Two adaptively weighted learning mechanisms are introduced to account for the inconsistencies among multiple views and the inconsistencies between the bidirectional learning processes. To solve the optimization problem involved in the BPSA model, an iterative solver is derived, and a rigorous convergence guarantee is provided. Extensive experimental results on both toy and real-world datasets demonstrate that our BPSA model achieves state-of-the-art performance even though it is parameter-free.
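The abstract's core idea — fusing view-wise affinities built from both feature matrices and similarity matrices into one shared orthogonal embedding, with adaptive view weights learned alongside it — can be loosely illustrated with a generic alternating scheme. The sketch below is a hedged illustration only: the function name, the quadratic affinities `X @ X.T`, and the trace-based weight update are my assumptions for a minimal demonstration, not the actual BPSA objective or solver described in the paper.

```python
import numpy as np

def multiview_spectral_fusion(Xs, Ss, k, n_iter=20):
    """Hedged sketch of bidirectional multiview embedding fusion.

    Xs: list of (n, d_v) feature matrices; Ss: list of (n, n) similarity
    matrices; k: embedding dimension. Alternates between (1) solving for
    a shared orthogonal embedding F as the top-k eigenvectors of the
    weighted fused affinity, and (2) re-weighting each view by how well
    F fits it. This mirrors the paper's bidirectional idea only loosely;
    the actual BPSA model and its parameter-free solver differ.
    """
    # Build one affinity per view in both "directions":
    # feature side via X X^T, similarity side via symmetrization.
    views = [X @ X.T for X in Xs] + [0.5 * (S + S.T) for S in Ss]
    w = np.full(len(views), 1.0 / len(views))  # adaptive view weights
    F = None
    for _ in range(n_iter):
        M = sum(wi * Mi for wi, Mi in zip(w, views))
        # F-step: orthogonal embedding = top-k eigenvectors of fused M
        _, vecs = np.linalg.eigh(M)            # eigenvalues ascending
        F = vecs[:, -k:]
        # weight-step: views that F explains well receive larger weight
        fits = np.array([np.trace(F.T @ Mi @ F) for Mi in views])
        fits = np.maximum(fits, 1e-12)         # guard against negatives
        w = fits / fits.sum()
    return F, w
```

Because `F` is a slice of an orthonormal eigenvector basis, the orthogonality constraint `F.T @ F = I` holds by construction at every iteration, which is one common way to realize the "consistent orthogonal embedding" the abstract refers to.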

Original language: English
Pages (from-to): 8939-8953
Number of pages: 15
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 35
Issue number: 7
DOIs
State: Published - 2024

Keywords

  • Adaptively weighted learning
  • bidirectional fusion
  • disturbed probabilistic subspace
  • multiview clustering
  • parameter-free model
