TY - JOUR
T1 - Double-Structured Sparsity Guided Flexible Embedding Learning for Unsupervised Feature Selection
AU - Guo, Yu
AU - Sun, Yuan
AU - Wang, Zheng
AU - Nie, Feiping
AU - Wang, Fei
N1 - Publisher Copyright:
© 2012 IEEE.
PY - 2024
Y1 - 2024
AB - In this article, we propose a novel unsupervised feature selection model combined with clustering, named double-structured sparsity guided flexible embedding learning (DSFEL). DSFEL includes a module for learning a block-diagonal structural sparse graph that represents the clustering structure and another module for learning a completely row-sparse projection matrix under an ℓ2,0-norm constraint to select distinctive features. Compared with the commonly used ℓ2,1-norm regularization term, the ℓ2,0-norm constraint avoids the drawbacks of limited sparsity and parameter tuning. Optimizing the ℓ2,0-norm constrained problem, which is nonconvex and nonsmooth, is a formidable challenge, and previous optimization algorithms have provided only approximate solutions. To address this issue, this article proposes an efficient optimization strategy that yields a closed-form solution. Finally, comprehensive experiments on nine real-world datasets demonstrate that the proposed method outperforms existing state-of-the-art unsupervised feature selection methods.
KW - block-diagonal structural sparse graph learning
KW - structural row-sparsity subspace learning
KW - unsupervised feature selection
KW - ℓ2,0-norm constraint optimization
UR - http://www.scopus.com/inward/record.url?scp=85159826424&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2023.3267184
DO - 10.1109/TNNLS.2023.3267184
M3 - Article
C2 - 37167052
AN - SCOPUS:85159826424
SN - 2162-237X
VL - 35
SP - 13354
EP - 13367
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 10
ER -