Abstract
In this article, we present an unsupervised maximum margin feature selection algorithm via sparse constraints. The algorithm combines feature selection and K-means clustering into a coherent framework. L2,1-norm regularization is applied to the transformation matrix to enable feature selection across all data samples. Our method reduces to a convex optimization problem, solved by an iterative algorithm that converges to an optimal solution; a convergence analysis of the algorithm is also provided. Experimental results demonstrate the efficiency of our algorithm.
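The key mechanism in the abstract is the L2,1-norm penalty: it sums the Euclidean norms of the rows of the transformation matrix, which drives entire rows toward zero so the corresponding features are discarded jointly across all samples. A minimal sketch of that idea (illustrative only; the function names and the ranking step are assumptions, not the paper's exact algorithm):

```python
import numpy as np

def l21_norm(W):
    # ||W||_{2,1} = sum over rows i of ||w_i||_2.
    # Row-wise sparsity: a zeroed row means the feature is unused.
    return np.sum(np.linalg.norm(W, axis=1))

def select_features(W, k):
    # Hypothetical post-processing step: rank features by the
    # Euclidean norm of their row in W and keep the top k.
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(scores)[::-1][:k]

# Toy transformation matrix: 5 features projected to 3 dimensions.
rng = np.random.default_rng(0)
W = rng.standard_normal((5, 3))
W[1] = 0.0  # a row driven to zero by the penalty: feature 1 is dropped
print(l21_norm(W))
print(select_features(W, 2))
```

Because the penalty acts on whole rows rather than individual entries, the same subset of features is selected for every data sample, which is what makes the regularizer suitable for feature selection rather than generic sparsification.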
Original language | English |
---|---|
Pages (from-to) | 1791-1799 |
Number of pages | 9 |
Journal | Neural Computing and Applications |
Volume | 21 |
Issue number | 7 |
DOIs | |
State | Published - Oct 2012 |
Externally published | Yes |
Keywords
- Feature selection
- K-means clustering
- Maximum margin criterion
- Regularization