Unsupervised Subspace Learning With Flexible Neighboring

Weizhong Yu, Jintang Bian, Feiping Nie, Rong Wang, Xuelong Li

Research output: Contribution to journal › Article › peer-review

6 Scopus citations

Abstract

Graph-based subspace learning has been widely used in various applications with the rapid growth of data dimensionality, where the graph is constructed from the affinity matrix of the input data. However, it is difficult for these subspace learning methods to preserve the intrinsic local structure of data in the presence of high-dimensional noise. To address this problem, we propose a novel unsupervised dimensionality reduction approach named unsupervised subspace learning with flexible neighboring (USFN). We learn a similarity graph through an adaptive probabilistic neighborhood learning process to preserve the manifold structure of high-dimensional data. In addition, we utilize flexible neighboring to learn the projection and the latent representation of the manifold structure of high-dimensional data, which removes the impact of noise. The adaptive similarity graph and the latent representation are jointly learned by integrating the adaptive probabilistic neighborhood learning and the manifold residue term into a unified objective function. Experimental results on synthetic and real-world datasets demonstrate the effectiveness of the proposed USFN method.
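The adaptive probabilistic neighborhood learning mentioned in the abstract is in the spirit of the adaptive-neighbor graph construction of Nie et al., where each sample's neighbor probabilities are obtained in closed form on the probability simplex. The Python sketch below illustrates that graph-construction step only, under that assumption; the function name adaptive_neighbor_graph and the neighborhood size k are hypothetical, and USFN's full objective, which jointly learns the projection and latent representation through the manifold residue term, is not reproduced here.

import numpy as np

def adaptive_neighbor_graph(X, k=10):
    """Row-wise adaptive probabilistic neighbor assignment (illustrative sketch).

    For each sample x_i, solves
        min_{s_i >= 0, s_i^T 1 = 1}  sum_j ||x_i - x_j||^2 s_ij + gamma_i * s_ij^2
    in closed form, which yields exactly k nonzero neighbor probabilities per row.
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances.
    sq = np.sum(X ** 2, axis=1)
    D = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    np.fill_diagonal(D, np.inf)          # exclude self-loops from the neighborhood

    S = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D[i])[: k + 1]  # k nearest neighbors plus the (k+1)-th for the margin
        d = D[i, idx]
        # Closed-form simplex solution: weights decay linearly with distance
        # and vanish exactly at the (k+1)-th neighbor.
        denom = k * d[k] - np.sum(d[:k]) + 1e-12
        S[i, idx[:k]] = np.maximum((d[k] - d[:k]) / denom, 0.0)
    # Symmetrize so the affinity matrix can serve as a graph for subspace learning.
    return (S + S.T) / 2.0

In a USFN-style pipeline, an affinity matrix of this kind would then enter the joint optimization over the projection and the latent representation rather than being fixed in advance.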

Original language: English
Pages (from-to): 2043-2056
Number of pages: 14
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 34
Issue number: 4
DOIs
State: Published - 1 Apr 2023

Keywords

  • Flexible neighboring
  • similarity graph
  • subspace learning
  • unsupervised learning
