Learning feature sparse principal subspace

Lai Tian, Feiping Nie, Rong Wang, Xuelong Li

Research output: Contribution to journal › Conference article › Peer-reviewed

18 Citations (Scopus)

Abstract

This paper presents new algorithms to solve the feature-sparsity constrained PCA problem (FSPCA), which performs feature selection and PCA simultaneously. Existing optimization methods for FSPCA require data distribution assumptions and lack a global convergence guarantee. Though the general FSPCA problem is NP-hard, we show that, for a low-rank covariance, FSPCA can be solved globally (Algorithm 1). We then propose another strategy (Algorithm 2) to solve FSPCA for a general covariance by iteratively building a carefully designed proxy. We prove (data-dependent) approximation bounds and convergence guarantees for the new algorithms. For covariance spectra with exponential or Zipf's distributions, we provide exponential and posynomial approximation bounds, respectively. Experimental results show the promising performance and efficiency of the new algorithms compared with the state of the art on both synthetic and real-world datasets.
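To make the FSPCA constraint concrete: the goal is a projection matrix W with orthonormal columns in which at most k rows are nonzero, so the principal subspace uses only k of the original features. The sketch below is not the paper's Algorithm 1 or 2; it is a minimal illustration of the row-sparse constraint using a simple heuristic (pick the k features with the largest covariance diagonal, then run ordinary PCA on the restricted covariance). The function name `fspca_sketch` and the diagonal-based selection rule are assumptions for illustration only.

```python
import numpy as np

def fspca_sketch(A, k, m):
    """Illustrative feature-sparse PCA sketch (NOT the paper's algorithms).

    A : (n, n) symmetric PSD covariance matrix
    k : number of features allowed to be active (row sparsity)
    m : subspace dimension

    Returns W of shape (n, m) with orthonormal columns and at most
    k nonzero rows, i.e. a feasible point of the FSPCA constraint set.
    """
    n = A.shape[0]
    # Heuristic feature selection: largest diagonal entries of A.
    idx = np.argsort(np.diag(A))[::-1][:k]
    # Ordinary PCA on the k-by-k restricted covariance.
    sub = A[np.ix_(idx, idx)]
    vals, vecs = np.linalg.eigh(sub)        # eigenvalues in ascending order
    W = np.zeros((n, m))
    W[idx, :] = vecs[:, -m:]                # embed top-m eigenvectors
    return W

# Usage: random PSD covariance with 10 features, keep 4, project to 2-D.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
A = X.T @ X
W = fspca_sketch(A, k=4, m=2)
print(np.count_nonzero(np.abs(W).sum(axis=1)))  # number of active features
print(np.allclose(W.T @ W, np.eye(2)))          # columns are orthonormal
```

This greedy selection has no approximation guarantee; the point of the paper is precisely to replace such heuristics with algorithms that carry provable (data-dependent) bounds and convergence guarantees.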

Original language: English
Journal: Advances in Neural Information Processing Systems
Volume: 2020-December
Publication status: Published - 2020
Event: 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Virtual, Online
Duration: 6 Dec 2020 - 12 Dec 2020
