Exact top-k feature selection via ℓ2,0-norm constraint

Xiao Cai, Feiping Nie, Heng Huang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

160 Scopus citations

Abstract

In this paper, we propose a novel, robust, and pragmatic feature selection approach. Unlike sparse-learning-based feature selection methods that tackle an approximate problem by imposing a sparsity regularization term in the objective function, the proposed method has only one ℓ2,0-norm loss term with an explicit ℓ2,0-norm equality constraint. An efficient algorithm based on the augmented Lagrangian method is derived to solve this constrained optimization problem and find a stable local solution. Extensive experiments on four biological datasets show that, although the proposed model is not convex, it outperforms its approximate convex counterparts and state-of-the-art feature selection methods in terms of the classification accuracy achieved by two popular classifiers. Moreover, since the regularization parameter of our method has an explicit meaning, namely the number of selected features, it avoids the burden of parameter tuning, making it a pragmatic feature selection method.
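As a rough illustration of the ℓ2,0-norm equality constraint described above, the sketch below keeps exactly k nonzero feature rows of a weight matrix, chosen by their row-wise ℓ2 norms. This is only the constraint-projection idea; it is not the paper's augmented Lagrangian algorithm, and the function name `project_l20` and the toy data are assumptions made for illustration.

```python
import numpy as np

def project_l20(W, k):
    """Project W onto {W : ||W||_{2,0} = k}: keep the k rows of W with the
    largest l2 norms and zero out the rest (illustrative sketch only)."""
    row_norms = np.linalg.norm(W, axis=1)   # l2 norm of each feature row
    keep = np.argsort(row_norms)[-k:]       # indices of the k largest rows
    W_proj = np.zeros_like(W)
    W_proj[keep] = W[keep]                  # exactly k nonzero rows remain
    return W_proj, np.sort(keep)

# Toy usage: select the top-3 feature rows from a random d x c weight matrix.
rng = np.random.default_rng(0)
W = rng.normal(size=(10, 4))                # d = 10 features, c = 4 classes
W_k, selected = project_l20(W, k=3)
print("selected feature indices:", selected)
```

Because the constraint fixes the number of nonzero rows directly, the single parameter k is simply the number of features to select, which is the "explicit meaning" the abstract refers to.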

Original language: English
Title of host publication: IJCAI 2013 - Proceedings of the 23rd International Joint Conference on Artificial Intelligence
Pages: 1240-1246
Number of pages: 7
State: Published - 2013
Externally published: Yes
Event: 23rd International Joint Conference on Artificial Intelligence, IJCAI 2013 - Beijing, China
Duration: 3 Aug 2013 – 9 Aug 2013

Publication series

Name: IJCAI International Joint Conference on Artificial Intelligence
ISSN (Print): 1045-0823

Conference

Conference: 23rd International Joint Conference on Artificial Intelligence, IJCAI 2013
Country/Territory: China
City: Beijing
Period: 3/08/13 – 9/08/13
