Learning Feature-Sparse Principal Subspace

Feiping Nie, Lai Tian, Rong Wang, Xuelong Li

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

Principal subspace estimation is directly connected to dimension reduction and is important when there is more than one principal component of interest. In this article, we introduce two new algorithms to solve the feature-sparsity constrained PCA problem (FSPCA) for the principal subspace estimation task, which performs feature selection and PCA simultaneously. Existing optimization methods for FSPCA require data distribution assumptions and lack a global convergence guarantee. Although the general FSPCA problem is NP-hard, we show that, for a low-rank covariance, FSPCA can be solved globally (Algorithm 1). We then propose another strategy (Algorithm 2) to solve FSPCA for a general covariance by iteratively building a carefully designed proxy. We prove a (data-dependent) approximation bound and convergence guarantees to regular stationary points for the new algorithms. For covariance spectra with an exponential or Zipf distribution, we provide exponential and posynomial approximation bounds, respectively. Constructive examples and numerical results demonstrate the tightness of our results. Experimental results show the promising performance and efficiency of the new algorithms compared with the state of the art on both synthetic and real-world datasets.
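
The abstract does not spell out the optimization problem; a common formulation of feature-sparsity constrained PCA, assumed in the sketch below, seeks a projection matrix whose rows share a common support across all components:

\max_{W \in \mathbb{R}^{d \times m}} \operatorname{tr}\left(W^{\top} A W\right) \quad \text{s.t.} \quad W^{\top} W = I_m, \quad \|W\|_{2,0} \le k,

where A is the d-by-d sample covariance, m is the subspace dimension, and the row-wise \ell_{2,0} pseudo-norm counts the nonzero rows of W, so at most k features participate in every principal direction.

Under that assumed formulation, the following minimal Python sketch solves small instances globally by brute force: for a fixed support, the optimum is given by the top-m eigenvectors of the corresponding principal submatrix, so enumerating all size-k supports yields the exact answer. It is purely illustrative (combinatorial cost) and is not Algorithm 1 or 2 from the paper; the function name and the random test covariance are hypothetical.

import itertools
import numpy as np

def fspca_bruteforce(A, k, m):
    """Exact feature-sparse PCA on a small covariance A by support enumeration.

    A : (d, d) symmetric covariance matrix
    k : number of features allowed to be active (nonzero rows of W)
    m : dimension of the principal subspace (m <= k)
    Returns (best_value, best_support, W) with W of shape (d, m).
    """
    d = A.shape[0]
    best_value, best_support, best_W = -np.inf, None, None
    for support in itertools.combinations(range(d), k):
        idx = np.array(support)
        # For a fixed support, maximizing tr(W^T A W) under W^T W = I
        # reduces to the top-m eigenvectors of the k-by-k submatrix.
        eigvals, eigvecs = np.linalg.eigh(A[np.ix_(idx, idx)])  # ascending order
        value = eigvals[-m:].sum()                               # top-m eigenvalue sum
        if value > best_value:
            W = np.zeros((d, m))
            W[idx, :] = eigvecs[:, -m:]   # embed the selected rows back into R^d
            best_value, best_support, best_W = value, support, W
    return best_value, best_support, best_W

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 8))
    A = np.cov(X, rowvar=False)
    value, support, W = fspca_bruteforce(A, k=3, m=2)
    print(value, support)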

Original language: English
Pages (from-to): 4858-4869
Number of pages: 12
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 45
Issue number: 4
DOIs
State: Published - 1 Apr 2023

Keywords

  • approximation algorithms
  • feature selection
  • nonconvex optimization
  • nonsmooth optimization
  • sparse PCA
