TY - JOUR
T1 - Sparse and Flexible Projections for Unsupervised Feature Selection
AU - Wang, Rong
AU - Zhang, Canyu
AU - Bian, Jintang
AU - Wang, Zheng
AU - Nie, Feiping
AU - Li, Xuelong
N1 - Publisher Copyright:
© 1989-2012 IEEE.
PY - 2023/6/1
Y1 - 2023/6/1
N2 - In recent decades, unsupervised feature selection methods have become increasingly popular. Nevertheless, most existing unsupervised feature selection methods suffer from two major problems that lead to suboptimal solutions. First, many methods impose a hard linear projection constraint on the original data, which is overly strict and ill-suited to data sampled from nonlinear manifolds. Second, most existing methods use ℓ2,p-norm (0 < p ≤ 1) regularization on the projection matrix to obtain a row-sparse matrix and then compute a score for each feature, which introduces an extra parameter and makes it difficult to directly obtain the indices of discriminative features. To solve the above problems, we propose two novel unsupervised feature selection methods, called SF2S and SF2SOG, which simultaneously learn optimal flexible projections and obtain an orthogonal sparse projection that directly selects discriminative features by applying an ℓ2,0-norm constraint. Moreover, we propose to explore the local structure of the flexible embedding by preserving the manifold structure of the original data and adaptively constructing an optimal graph in the subspace. Finally, novel iterative optimization algorithms are presented to solve the objective functions with theoretical convergence guarantees. Evaluation experiments on synthetic and real-world datasets demonstrate the effectiveness and superiority of the proposed methods.
AB - In recent decades, unsupervised feature selection methods have become increasingly popular. Nevertheless, most existing unsupervised feature selection methods suffer from two major problems that lead to suboptimal solutions. First, many methods impose a hard linear projection constraint on the original data, which is overly strict and ill-suited to data sampled from nonlinear manifolds. Second, most existing methods use ℓ2,p-norm (0 < p ≤ 1) regularization on the projection matrix to obtain a row-sparse matrix and then compute a score for each feature, which introduces an extra parameter and makes it difficult to directly obtain the indices of discriminative features. To solve the above problems, we propose two novel unsupervised feature selection methods, called SF2S and SF2SOG, which simultaneously learn optimal flexible projections and obtain an orthogonal sparse projection that directly selects discriminative features by applying an ℓ2,0-norm constraint. Moreover, we propose to explore the local structure of the flexible embedding by preserving the manifold structure of the original data and adaptively constructing an optimal graph in the subspace. Finally, novel iterative optimization algorithms are presented to solve the objective functions with theoretical convergence guarantees. Evaluation experiments on synthetic and real-world datasets demonstrate the effectiveness and superiority of the proposed methods.
KW - Feature selection
KW - flexible projection
KW - optimal graph
KW - unsupervised learning
KW - ℓ2,0-norm
UR - http://www.scopus.com/inward/record.url?scp=85128611348&partnerID=8YFLogxK
U2 - 10.1109/TKDE.2022.3167996
DO - 10.1109/TKDE.2022.3167996
M3 - Article
AN - SCOPUS:85128611348
SN - 1041-4347
VL - 35
SP - 6362
EP - 6375
JO - IEEE Transactions on Knowledge and Data Engineering
JF - IEEE Transactions on Knowledge and Data Engineering
IS - 6
ER -