TY - JOUR
T1 - Simultaneous local clustering and unsupervised feature selection via strong space constraint
AU - Wang, Zheng
AU - Li, Qi
AU - Zhao, Haifeng
AU - Nie, Feiping
N1 - Publisher Copyright:
© 2023 Elsevier Ltd
PY - 2023/10
Y1 - 2023/10
N2 - Clustering is a popular method in machine learning tasks. However, high-dimensional data poses many obstacles for clustering approaches. To address this problem, unsupervised feature selection (UFS) methods can be incorporated into clustering to reduce dimensionality. In general, most UFS methods adopt the ℓ2,1-norm for subspace sparsity learning. However, its sparsity relies heavily on the setting of a trade-off parameter, which may lead to instability of ranking results and difficulty in obtaining the optimal solution of the projection matrix. In this paper, we propose to directly learn a strictly row-sparse subspace via the ℓ2,0-norm constraint, a method called Sparse constraint and Local learning for Unsupervised Feature Selection (SLUFS). The ℓ2,0-norm is an ideal sparse subspace constraint that can overcome the drawbacks of the ℓ2,1-norm. However, optimizing under the ℓ2,0-norm constraint is an NP-hard problem; at present, only approximate solutions can be given, and their convergence cannot be guaranteed. To tackle this challenge, we design a novel alternating iterative algorithm to directly optimize the ℓ2,0-norm based model. Most importantly, our strategy obtains a closed-form solution with a strict convergence guarantee. Comprehensive experiments are conducted on several real-world datasets to evaluate the performance of SLUFS in comparison with several related state-of-the-art methods.
AB - Clustering is a popular method in machine learning tasks. However, high-dimensional data poses many obstacles for clustering approaches. To address this problem, unsupervised feature selection (UFS) methods can be incorporated into clustering to reduce dimensionality. In general, most UFS methods adopt the ℓ2,1-norm for subspace sparsity learning. However, its sparsity relies heavily on the setting of a trade-off parameter, which may lead to instability of ranking results and difficulty in obtaining the optimal solution of the projection matrix. In this paper, we propose to directly learn a strictly row-sparse subspace via the ℓ2,0-norm constraint, a method called Sparse constraint and Local learning for Unsupervised Feature Selection (SLUFS). The ℓ2,0-norm is an ideal sparse subspace constraint that can overcome the drawbacks of the ℓ2,1-norm. However, optimizing under the ℓ2,0-norm constraint is an NP-hard problem; at present, only approximate solutions can be given, and their convergence cannot be guaranteed. To tackle this challenge, we design a novel alternating iterative algorithm to directly optimize the ℓ2,0-norm based model. Most importantly, our strategy obtains a closed-form solution with a strict convergence guarantee. Comprehensive experiments are conducted on several real-world datasets to evaluate the performance of SLUFS in comparison with several related state-of-the-art methods.
KW - Local structure learning
KW - Unsupervised feature selection
KW - ℓ2,0-Norm constraint optimization
UR - http://www.scopus.com/inward/record.url?scp=85160754223&partnerID=8YFLogxK
U2 - 10.1016/j.patcog.2023.109718
DO - 10.1016/j.patcog.2023.109718
M3 - Article
AN - SCOPUS:85160754223
SN - 0031-3203
VL - 142
JO - Pattern Recognition
JF - Pattern Recognition
M1 - 109718
ER -