Feature selection via joint embedding learning and sparse regression

Chenping Hou, Feiping Nie, Dongyun Yi, Yi Wu

Research output: Chapter in Book/Report/Conference proceeding > Conference contribution > peer-review

180 Scopus citations

Abstract

The problem of feature selection has attracted considerable research interest in the past few years. Traditional learning-based feature selection methods separate embedding learning from feature ranking. In this paper, we introduce a novel unsupervised feature selection approach via Joint Embedding Learning and Sparse Regression (JELSR). Instead of simply employing the graph Laplacian for embedding learning followed by regression, we use weights obtained via locally linear approximation to construct the graph, and we unify embedding learning with sparse regression to perform feature selection. By adding ℓ2,1-norm regularization, we can learn a sparse matrix for feature ranking. We also provide an effective method to solve the proposed problem. Compared with traditional unsupervised feature selection methods, our approach integrates the merits of embedding learning and sparse regression. Extensive experimental results are provided to demonstrate its validity.
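To make the idea above concrete, the following is a minimal sketch, not the authors' implementation, of an alternating scheme in the spirit of JELSR: reconstruction weights from a locally linear approximation define the graph term, the embedding and the regression matrix are updated together, the ℓ2,1-norm is handled by iteratively reweighted least squares, and features are finally ranked by the row norms of the learned sparse matrix. The function names, the hyperparameters alpha and beta, the neighbourhood size k, the embedding dimension m, and the specific update rules are all assumptions made for illustration.

```python
import numpy as np

def lle_weights(X, k=5, reg=1e-3):
    """LLE-style reconstruction weights for the columns of X (d x n). Assumed graph construction."""
    d, n = X.shape
    S = np.zeros((n, n))
    sq = np.sum(X ** 2, axis=0)
    dist = sq[:, None] + sq[None, :] - 2.0 * X.T @ X   # pairwise squared distances
    for i in range(n):
        order = np.argsort(dist[i])
        nbrs = order[order != i][:k]                    # k nearest neighbours of x_i
        Z = X[:, nbrs] - X[:, [i]]                      # centred neighbourhood, d x k
        G = Z.T @ Z                                     # local Gram matrix
        G += reg * np.trace(G) * np.eye(k)              # regularise for numerical stability
        w = np.linalg.solve(G, np.ones(k))
        S[i, nbrs] = w / w.sum()                        # reconstruction weights sum to 1
    return S

def jelsr_sketch(X, m=5, k=5, alpha=1.0, beta=1.0, n_iter=30, eps=1e-8):
    """Illustrative sketch: rank the features of X (d x n, features x samples)
    by jointly learning an embedding and an l2,1-regularised regression matrix."""
    d, n = X.shape
    S = lle_weights(X, k)
    M = (np.eye(n) - S).T @ (np.eye(n) - S)             # graph term from the LLE weights
    D = np.eye(d)                                       # IRLS reweighting matrix for ||W||_{2,1}
    for _ in range(n_iter):
        A = X @ X.T + alpha * D                         # d x d
        AinvX = np.linalg.solve(A, X)                   # A^{-1} X
        B = M + beta * (np.eye(n) - X.T @ AinvX)        # couples embedding and regression
        B = 0.5 * (B + B.T)                             # symmetrise for numerical safety
        evals, evecs = np.linalg.eigh(B)
        Y = evecs[:, :m]                                # embedding: m smallest eigenvectors
        W = AinvX @ Y                                   # regression matrix, d x m
        row_norms = np.sqrt(np.sum(W ** 2, axis=1)) + eps
        D = np.diag(1.0 / (2.0 * row_norms))            # IRLS update for the l2,1 penalty
    scores = np.sqrt(np.sum(W ** 2, axis=1))            # feature importance = row norms of W
    return np.argsort(-scores), W
```

In this sketch, jelsr_sketch(X, m=5, k=5) would return an ordering of the d features of a d x n data matrix, most relevant first; in practice the regularization parameters alpha and beta would be tuned, for example by grid search.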

Original language: English
Title of host publication: IJCAI 2011 - 22nd International Joint Conference on Artificial Intelligence
Pages: 1324-1329
Number of pages: 6
DOIs
State: Published - 2011
Externally published: Yes
Event: 22nd International Joint Conference on Artificial Intelligence, IJCAI 2011 - Barcelona, Catalonia, Spain
Duration: 16 Jul 2011 - 22 Jul 2011

Publication series

Name: IJCAI International Joint Conference on Artificial Intelligence
ISSN (Print): 1045-0823

Conference

Conference: 22nd International Joint Conference on Artificial Intelligence, IJCAI 2011
Country/Territory: Spain
City: Barcelona, Catalonia
Period: 16/07/11 - 22/07/11
