Abstract
In recent decades, unsupervised feature selection methods have become increasingly popular. Nevertheless, most existing unsupervised feature selection methods suffer from two major problems that lead to suboptimal solutions. First, many methods impose a hard linear projection constraint on the original data, which is overly strict and ill-suited to data sampled from nonlinear manifolds. Second, most existing methods apply ℓ2,p-norm (0 < p ≤ 1) regularization to the projection matrix to obtain a row-sparse matrix and then score each feature, which introduces an extra parameter and rarely yields the indexes of discriminative features directly. To address these problems, we propose two novel unsupervised feature selection methods, called SF2S and SF2SOG, which simultaneously learn optimal flexible projections and obtain an orthogonal sparse projection that directly selects discriminative features by applying an ℓ2,0-norm constraint. Moreover, we propose to explore the local structure of the flexible embedding by preserving the manifold structure of the original data and adaptively constructing an optimal graph in the subspace. Third, novel iterative optimization algorithms are presented to solve the objective functions with theoretically guaranteed convergence. Extensive experiments on synthetic and real-world datasets demonstrate the effectiveness and superiority of our proposed methods.
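The abstract contrasts the common ℓ2,p-norm approach, which regularizes a projection matrix and then ranks features by the norms of its rows, with the proposed ℓ2,0-norm constraint that yields feature indexes directly. As a minimal, hypothetical sketch (not the authors' algorithm), the conventional row-norm scoring step it critiques can be illustrated as follows, where `W` stands for an already-learned projection matrix with one row per original feature:

```python
import numpy as np

def select_features_by_row_norms(W, k):
    """Score each feature by the l2-norm of its row in W,
    then return the indexes of the k highest-scoring features.
    This is the post-hoc ranking step used by l2,p-regularized
    methods; an l2,0-constrained method avoids it by forcing
    all but k rows of W to be exactly zero."""
    scores = np.linalg.norm(W, axis=1)          # per-feature row norms
    return np.argsort(scores)[::-1][:k]          # top-k feature indexes

# toy projection matrix: 5 features projected to 2 dimensions
W = np.array([[0.90, 0.10],
              [0.00, 0.00],
              [0.50, 0.50],
              [0.05, 0.02],
              [0.80, 0.30]])
idx = select_features_by_row_norms(W, 2)
print(sorted(idx.tolist()))  # features 0 and 4 have the largest row norms
```

Note that with a hard ℓ2,0-norm constraint on `W`, the selected features would simply be the rows that remain nonzero, with no separate scoring pass or extra sparsity parameter.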
| Original language | English |
|---|---|
| Pages (from-to) | 6362-6375 |
| Number of pages | 14 |
| Journal | IEEE Transactions on Knowledge and Data Engineering |
| Volume | 35 |
| Issue number | 6 |
| DOIs | |
| State | Published - 1 Jun 2023 |
Keywords
- Feature selection
- flexible projection
- optimal graph
- unsupervised learning
- ℓ2,0-norm
Sparse and Flexible Projections for Unsupervised Feature Selection