TY - JOUR
T1 - Subspace Sparse Discriminative Feature Selection
AU - Nie, Feiping
AU - Wang, Zheng
AU - Tian, Lai
AU - Wang, Rong
AU - Li, Xuelong
N1 - Publisher Copyright:
© 2013 IEEE.
PY - 2022/6/1
Y1 - 2022/6/1
N2 - In this article, we propose a novel feature selection approach that explicitly addresses the long-standing subspace sparsity issue. Leveraging ℓ2,1-norm regularization for feature selection is the dominant strategy in existing methods, but it suffers from limited sparsity and troublesome parameter tuning. To circumvent this problem, employing the ℓ2,0-norm constraint to improve the sparsity of the model has recently gained attention; however, optimizing the subspace sparsity constraint remains an open problem, for which existing approaches obtain only approximate solutions without a convergence proof. To address these challenges, we propose a novel subspace sparsity discriminative feature selection (S2DFS) method that leverages a subspace sparsity constraint to avoid parameter tuning. In addition, the trace-ratio formulated objective function ensures the discriminability of the selected features. Most importantly, an efficient iterative optimization algorithm is presented that explicitly solves the proposed problem with a closed-form solution and a strict convergence proof. To the best of our knowledge, this article is the first to propose such an optimization algorithm for the subspace sparsity issue, and a general formulation of the algorithm is provided to improve the extensibility and portability of our method. Extensive experiments conducted on several high-dimensional text and image datasets demonstrate that the proposed method outperforms related state-of-the-art methods on pattern classification and image retrieval tasks.
AB - In this article, we propose a novel feature selection approach that explicitly addresses the long-standing subspace sparsity issue. Leveraging ℓ2,1-norm regularization for feature selection is the dominant strategy in existing methods, but it suffers from limited sparsity and troublesome parameter tuning. To circumvent this problem, employing the ℓ2,0-norm constraint to improve the sparsity of the model has recently gained attention; however, optimizing the subspace sparsity constraint remains an open problem, for which existing approaches obtain only approximate solutions without a convergence proof. To address these challenges, we propose a novel subspace sparsity discriminative feature selection (S2DFS) method that leverages a subspace sparsity constraint to avoid parameter tuning. In addition, the trace-ratio formulated objective function ensures the discriminability of the selected features. Most importantly, an efficient iterative optimization algorithm is presented that explicitly solves the proposed problem with a closed-form solution and a strict convergence proof. To the best of our knowledge, this article is the first to propose such an optimization algorithm for the subspace sparsity issue, and a general formulation of the algorithm is provided to improve the extensibility and portability of our method. Extensive experiments conducted on several high-dimensional text and image datasets demonstrate that the proposed method outperforms related state-of-the-art methods on pattern classification and image retrieval tasks.
KW - Classification
KW - image retrieval
KW - subspace sparsity constraint optimization
KW - supervised feature selection
UR - http://www.scopus.com/inward/record.url?scp=85132454748&partnerID=8YFLogxK
U2 - 10.1109/TCYB.2020.3025205
DO - 10.1109/TCYB.2020.3025205
M3 - Article
C2 - 33055053
AN - SCOPUS:85132454748
SN - 2168-2267
VL - 52
SP - 4221
EP - 4233
JO - IEEE Transactions on Cybernetics
JF - IEEE Transactions on Cybernetics
IS - 6
ER -