Implicit Weight Learning for Multi-View Clustering

Feiping Nie, Shaojun Shi, Jing Li, Xuelong Li

Research output: Contribution to journal › Article › peer-review


Abstract

Exploiting different representations, or views, of the same object for better clustering has become very popular in recent years; this setting is conventionally called multi-view clustering. In general, it is essential to measure the importance of each individual view, since views differ in their noise levels and inherent descriptive capacities. Many previous works model view importance as a weight, which is simple yet empirically effective. In this article, instead of following this traditional line, we propose a new weight learning paradigm for multi-view clustering based on the idea of the reweighted approach, and we theoretically analyze its working mechanism. As a carefully designed instance, all views are connected through a unified graph with a Laplacian rank constraint, which serves as a representative method for comparison with other weight learning approaches in the experiments. Furthermore, the proposed weight learning strategy is well suited to multi-view data and can be naturally integrated with many existing clustering learners. Numerical experiments show that the proposed implicit weight learning approach is effective and practical for multi-view clustering.
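To illustrate the reweighted idea behind implicit weight learning, the following is a minimal Python/NumPy sketch, not the authors' algorithm: the Laplacian rank constraint on the unified graph is omitted, the unified-graph update is simplified to a row-wise simplex-projected weighted average, and the function names (implicit_weight_clustering, simplex_projection) and the weight rule w_v = 1 / (2 * ||S - A_v||_F) are illustrative assumptions drawn from the generic reweighted framework rather than from the paper.

```python
# Minimal sketch of implicit (reweighted) view-weight learning for
# multi-view clustering. Simplified: no Laplacian rank constraint, and the
# unified graph S is just a projected weighted average of the view graphs.
import numpy as np


def simplex_projection(v):
    """Project a vector onto the probability simplex."""
    u = np.sort(v)[::-1]
    cssv = np.cumsum(u) - 1.0
    rho = np.nonzero(u - cssv / (np.arange(len(v)) + 1) > 0)[0][-1]
    theta = cssv[rho] / (rho + 1.0)
    return np.maximum(v - theta, 0.0)


def implicit_weight_clustering(affinities, n_iter=30):
    """Alternately update a unified graph S and implicit view weights.

    affinities : list of (n, n) per-view affinity matrices A_v.
    The weights are never treated as explicit optimization variables; they
    fall out of the reweighted objective as w_v = 1 / (2 * ||S - A_v||_F).
    """
    n_views = len(affinities)
    w = np.full(n_views, 1.0 / n_views)  # start from uniform weights

    for _ in range(n_iter):
        # Unified-graph step: weighted average of the views, each row
        # projected onto the simplex so S stays a valid affinity graph.
        S = sum(w[v] * affinities[v] for v in range(n_views)) / w.sum()
        S = np.apply_along_axis(simplex_projection, 1, S)

        # Implicit weight step: reweighted rule derived from minimizing
        # sum_v ||S - A_v||_F, with no explicit weight variable.
        dists = np.array([np.linalg.norm(S - A, 'fro') for A in affinities])
        w = 1.0 / (2.0 * np.maximum(dists, 1e-12))

    return S, w / w.sum()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two synthetic views of 10 samples: row-stochastic random affinities.
    views = []
    for _ in range(2):
        A = rng.random((10, 10))
        views.append(A / A.sum(axis=1, keepdims=True))
    S, weights = implicit_weight_clustering(views)
    print("learned view weights:", weights)
```

In this simplified sketch, a view whose affinity graph lies far from the consensus graph S receives a smaller implicit weight at the next iteration, which is the core mechanism the abstract refers to; the actual method additionally constrains the rank of the Laplacian of S so that the unified graph has exactly the desired number of connected components.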

Original language: English
Pages (from-to): 4223-4236
Number of pages: 14
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 34
Issue number: 8
DOIs
State: Published - 1 Aug 2023

Keywords

  • Graph-based clustering
  • multi-view clustering
  • rank constraint
  • weight learning
