TY - JOUR
T1 - Unsupervised Discriminative Feature Selection With l2,0-Norm Constrained Sparse Projection
AU - Nie, Feiping
AU - Dong, Xia
AU - Tian, Lai
AU - Wang, Rong
AU - Li, Xuelong
N1 - Publisher Copyright:
© 1979-2012 IEEE.
PY - 2025
Y1 - 2025
N2 - Feature selection plays an important role in a wide spectrum of applications. Most of the sparsity-based feature selection methods tend to solve the relaxed l2,p-norm (0 ≤ p ≤ 1) regularized problem, leading to the output of a sub-optimal feature subset and the heavy work of tuning regularization parameters. Optimizing the non-convex l2,0-norm constrained problem is still an open question. Existing optimization algorithms used to solve the l2,0-norm constrained problem require specific data distribution assumptions and cannot guarantee global convergence. In this article, we propose an unsupervised discriminative feature selection method with l2,0-norm constrained sparse projection (SPDFS) to address the above issues. To this end, fuzzy membership degree learning and l2,0-norm constrained projection learning are simultaneously performed to learn a feature-sparse projection for discriminative feature selection. More importantly, two optimization strategies are developed to optimize the proposed NP-hard problem. Specifically, a non-iterative algorithm with a globally optimal solution is derived for a special case, and an iterative algorithm with both a rigorous ascent property and an approximation guarantee is designed for the general case. Experimental results on both toy and real-world datasets demonstrate the superiority of the proposed method over some state-of-the-art methods in data clustering and text classification tasks.
KW - data clustering
KW - Discriminative feature selection
KW - fuzziness
KW - linear discriminant analysis
KW - sparse projection
KW - text classification
KW - unsupervised learning
KW - ℓ2,0-norm
UR - http://www.scopus.com/inward/record.url?scp=105008913888&partnerID=8YFLogxK
U2 - 10.1109/TPAMI.2025.3580669
DO - 10.1109/TPAMI.2025.3580669
M3 - Article
AN - SCOPUS:105008913888
SN - 0162-8828
JO - IEEE Transactions on Pattern Analysis and Machine Intelligence
JF - IEEE Transactions on Pattern Analysis and Machine Intelligence
ER -