Multi-view unsupervised dimensionality reduction with probabilistic neighbors

Qianyao Qiang, Bin Zhang, Fei Wang, Feiping Nie

Research output: Contribution to journal › Article › peer-review

8 Scopus citations

Abstract

Multi-view Unsupervised Dimensionality Reduction (MUDR) aims to find an optimal low-dimensional subspace for the original unlabeled high-dimensional multi-view data. Different views of multi-view data describe specific aspects that are independent of or complementary to each other. How to make full use of the information from multiple views remains an open issue, especially in the absence of labels. In this study, we propose a novel model referred to as Multi-view unsupervised Dimensionality reduction with Probabilistic Neighbors (MDPN). It learns a multi-view consensus similarity matrix by adaptively weighting each view and assigning optimal neighbors to each sample. The consensus matrix is probabilistic: it encodes the probability that each pair of samples are neighbors. During optimization, the projection matrix and the consensus matrix negotiate with each other to find the optimal subspace, so that the structure and invariances of the original data are preserved and the projection matrix best serves dimensionality reduction. Moreover, considering the imbalance of dimensionality across views, we extend this model to achieve flexible MUDR by projecting data from different views into subspaces of different dimensions. Two two-step iterative algorithms are developed to efficiently optimize the resulting objective problems. Extensive experimental results on synthetic and benchmark datasets demonstrate the superiority of our models.
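The abstract does not spell out the optimization, but the two-step alternation it describes can be sketched roughly as follows. Everything in this sketch (the function mdpn_sketch, the eigenproblem update for the projections, the closed-form simplex rows for the probabilistic neighbors, and the view-weighting heuristic) is an illustrative assumption in the spirit of the abstract, not the authors' actual algorithm.

```python
import numpy as np

def mdpn_sketch(views, d, k=5, n_iters=20):
    """Hedged sketch: alternate between per-view projections W_v and a
    consensus probabilistic-neighbor matrix S. Plausible stand-in updates,
    NOT the paper's exact method."""
    n, V = views[0].shape[0], len(views)
    weights = np.full(V, 1.0 / V)           # adaptive view weights, uniform init
    S = np.full((n, n), 1.0 / n)            # consensus similarity, rows sum to 1
    Ws = [None] * V

    for _ in range(n_iters):
        # Step 1: with S fixed, update each view's projection from the
        # Laplacian of the symmetrized consensus graph.
        A = (S + S.T) / 2.0
        L = np.diag(A.sum(axis=1)) - A
        for v, X in enumerate(views):
            M = X.T @ L @ X                               # (d_v x d_v), symmetric
            _, eigvecs = np.linalg.eigh(M)
            Ws[v] = eigvecs[:, :d]                        # smallest-eigenvalue directions

        # Heuristic re-weighting: views whose projections fit the consensus
        # graph better (smaller embedding cost) receive larger weights.
        costs = np.array([np.trace(Ws[v].T @ X.T @ L @ X @ Ws[v])
                          for v, X in enumerate(views)])
        weights = 1.0 / np.sqrt(np.maximum(costs, 1e-12))
        weights /= weights.sum()

        # Step 2: with projections fixed, refresh each row of S from weighted
        # projected distances; the closed-form simplex solution of
        # adaptive-neighbor learning yields sparse, probability-like rows.
        D = np.zeros((n, n))
        for v, X in enumerate(views):
            Z = X @ Ws[v]
            D += weights[v] * np.square(Z[:, None, :] - Z[None, :, :]).sum(-1)
        S = np.zeros((n, n))
        for i in range(n):
            idx = np.argsort(D[i])[1:k + 2]               # k+1 nearest, excluding self
            di = D[i, idx]
            denom = max(k * di[k] - di[:k].sum(), 1e-12)
            S[i, idx[:k]] = (di[k] - di[:k]) / denom      # nonnegative, sums to 1

    return Ws, S
```

Each row of S is updated independently in closed form, which is what makes the assigned neighbors both sparse and probabilistic (each row is nonnegative and sums to one); the alternation with the projection step mirrors the "negotiation" between the consensus matrix and the projection matrix described above.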

Original language: English
Pages (from-to): 203-216
Number of pages: 14
Journal: Neurocomputing
Volume: 500
DOIs
State: Published - 21 Aug 2022

Keywords

  • Consensus similarity matrix
  • Low-dimensional subspace
  • Multi-view unsupervised dimensionality reduction
  • Probabilistic neighbors
