Feature selection at the discrete limit

Miao Zhang, Chris Ding, Ya Zhang, Feiping Nie

Research output: Conference contribution (chapter in book/report/conference proceeding), peer-reviewed

53 Scopus citations

Abstract

Feature selection plays an important role in many machine learning and data mining applications. In this paper, we propose to use the L2,p norm for feature selection, with emphasis on small p. As p → 0, feature selection becomes a discrete feature selection problem. We provide two algorithms: a proximal gradient algorithm and a rank-one update algorithm, the latter being more efficient at large regularization λ. We provide closed-form solutions of the proximal operator at p = 0 and p = 1/2. Experiments on real-life datasets show that features selected at small p consistently outperform features selected at p = 1 (the standard L2,1 approach) and by other popular feature selection methods.
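As a rough illustration of the idea (a sketch, not the paper's exact algorithms), the proximal operator at p = 0 reduces to row-wise hard thresholding of the weight matrix: a feature's row survives only if its L2 norm is large enough to pay the λ penalty. Contrast this with the p = 1 (L2,1) case, which merely shrinks row norms. The function names and the 2×2 example below are illustrative assumptions.

```python
import numpy as np

def prox_l20(W, lam):
    """Row-wise prox of lam * ||W||_{2,0}: minimize 0.5*||X - W||_F^2 + lam * (#nonzero rows of X).

    A row w_i is kept iff 0.5*||w_i||^2 > lam, i.e. ||w_i|| > sqrt(2*lam);
    otherwise it is zeroed out (hard thresholding on row norms).
    """
    W = np.asarray(W, dtype=float)
    norms = np.linalg.norm(W, axis=1)
    keep = norms > np.sqrt(2.0 * lam)
    return W * keep[:, None]

def prox_l21(W, lam):
    """Row-wise prox of lam * ||W||_{2,1}: group soft thresholding of row norms."""
    W = np.asarray(W, dtype=float)
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - lam / np.maximum(norms, 1e-12))
    return W * scale

# Illustrative example: with lam = 1, only the strong row survives hard thresholding,
# while soft thresholding shrinks it instead of selecting it outright.
W = np.array([[3.0, 0.0],
              [0.1, 0.0]])
print(prox_l20(W, 1.0))  # row 1 kept as-is, row 2 zeroed
print(prox_l21(W, 1.0))  # row 1 shrunk toward zero, row 2 zeroed
```

The p = 0 prox selects features discretely (kept rows are unchanged), which is the "discrete limit" the title refers to; the L2,1 prox biases the surviving rows, one motivation the abstract gives for small p.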

Original language: English
Title of host publication: Proceedings of the National Conference on Artificial Intelligence
Publisher: AI Access Foundation
Pages: 1355-1361
Number of pages: 7
ISBN (Electronic): 9781577356783
State: Published - 2014
Externally published: Yes
Event: 28th AAAI Conference on Artificial Intelligence (AAAI 2014), 26th Innovative Applications of Artificial Intelligence Conference (IAAI 2014), and the 5th Symposium on Educational Advances in Artificial Intelligence (EAAI 2014) - Quebec City, Canada
Duration: 27 Jul 2014 - 31 Jul 2014

Publication series

Name: Proceedings of the National Conference on Artificial Intelligence
Volume: 2

Conference

Conference: 28th AAAI Conference on Artificial Intelligence (AAAI 2014), 26th Innovative Applications of Artificial Intelligence Conference (IAAI 2014), and the 5th Symposium on Educational Advances in Artificial Intelligence (EAAI 2014)
Country/Territory: Canada
City: Quebec City
Period: 27/07/14 - 31/07/14

