Unsupervised maximum margin feature selection via L2,1-norm minimization

Shizhun Yang, Chenping Hou, Feiping Nie, Yi Wu

Research output: Contribution to journal › Article › peer-review

32 Scopus citations

Abstract

In this article, we present an unsupervised maximum margin feature selection algorithm via sparse constraints. The algorithm combines feature selection and K-means clustering into a coherent framework. L2,1-norm regularization is applied to the transformation matrix to enable feature selection across all data samples. Our method reduces to a convex optimization problem, solved by an iterative algorithm that converges to an optimal solution. A convergence analysis of the algorithm is also provided. Experimental results demonstrate the efficiency of our algorithm.
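The paper's full optimization is not reproduced in this abstract, but the L2,1-norm mechanism it refers to can be sketched. In a minimal illustration (assuming a learned transformation matrix `W` whose rows correspond to features, as is standard in this line of work), the L2,1-norm is the sum of the L2 norms of the rows of `W`; penalizing it drives entire rows toward zero, and features are then ranked by their row norms:

```python
import numpy as np

def l21_norm(W):
    """L2,1-norm of W: sum over rows of the L2 norm of each row.

    Minimizing this penalty encourages whole rows of W to shrink to
    zero, which discards the corresponding features jointly across
    all data samples.
    """
    return np.sqrt((W ** 2).sum(axis=1)).sum()

def select_features(W, k):
    """Rank features by the L2 norm of their row in W and return the
    indices of the k rows with the largest norms (the retained features).
    """
    row_norms = np.sqrt((W ** 2).sum(axis=1))
    return np.argsort(row_norms)[::-1][:k]

# Hypothetical 3-feature, 2-dimensional transformation matrix:
W = np.array([[3.0, 4.0],   # feature 0: row norm 5
              [0.0, 0.0],   # feature 1: zeroed out by the penalty
              [1.0, 0.0]])  # feature 2: row norm 1
print(l21_norm(W))          # 5 + 0 + 1 = 6.0
print(select_features(W, 2))
```

This sketch only illustrates the row-sparsity effect of the penalty; the paper's actual algorithm learns `W` iteratively within the maximum margin / K-means clustering objective.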

Original language: English
Pages (from-to): 1791-1799
Number of pages: 9
Journal: Neural Computing and Applications
Volume: 21
Issue number: 7
DOIs
State: Published - Oct 2012
Externally published: Yes

Keywords

  • Feature selection
  • K-means clustering
  • Maximum margin criterion
  • Regularization

