Feature selection under regularized orthogonal least square regression with optimal scaling

Rui Zhang, Feiping Nie, Xuelong Li

Research output: Contribution to journal › Article › peer-review

29 Scopus citations

Abstract

Due to the lack of scale change in orthogonal least square regression (OLSR), a scaling term is introduced into OLSR in this paper to build a novel orthogonal least square regression with optimal scaling (OLSR-OS) problem. The proposed OLSR-OS problem is proven to be numerically better behaved than the OLSR problem. To select relevant features under the proposed OLSR-OS problem, ℓ2,1-norm regularization is further introduced so that a row-sparse projection is achieved. Accordingly, a novel parameterized expansion balanced feature selection (PEB-FS) method is derived based on an extension-balanced counterpart. Moreover, the convergence of the proposed PEB-FS method is established, and the optimal scaling is obtained automatically. Consequently, the effectiveness and superiority of the proposed PEB-FS method are verified both theoretically and experimentally.
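For readers who want a concrete picture of the ingredients summarized above, the following is a minimal, hypothetical sketch (not the authors' PEB-FS algorithm): it solves the unregularized orthogonal least square problem min ||b·XW − Y||² with W^T W = I and an optimal scalar scaling b by expanding it to a balanced (square) Procrustes problem with auxiliary columns, then ranks features by the row norms of W, which is the quantity the paper's ℓ2,1-norm penalty drives toward row sparsity. All function and variable names are illustrative, and the ℓ2,1 regularization itself is omitted.

# Hypothetical sketch of OLSR with optimal scaling, NOT the paper's PEB-FS method.
import numpy as np

def olsr_os_sketch(X, Y, n_iter=50):
    """X: (n, d) data matrix, Y: (n, c) one-hot targets, with d >= c."""
    n, d = X.shape
    c = Y.shape[1]
    E = np.zeros((n, d - c))          # auxiliary columns that balance the Procrustes problem
    b = 1.0                           # scaling term
    for _ in range(n_iter):
        # Balanced orthogonal Procrustes step: min over orthogonal V of ||b*X V - [Y, E]||_F^2
        B = np.hstack([Y, E])
        P, _, Qt = np.linalg.svd(b * X.T @ B)
        V = P @ Qt                    # d x d orthogonal matrix
        W, M = V[:, :c], V[:, c:]     # projection of interest and padding part
        # Optimal scalar scaling for fixed V (closed-form least squares)
        XV = X @ V
        b = np.sum(B * XV) / np.sum(XV * XV)
        # Re-balance: choose the padding target so its residual vanishes
        E = b * X @ M
    # Feature relevance: row norms of W (rows an l2,1 penalty would shrink to zero)
    scores = np.linalg.norm(W, axis=1)
    return W, b, np.argsort(-scores)

# Toy usage
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
Y = np.eye(3)[rng.integers(0, 3, 100)]
W, b, ranking = olsr_os_sketch(X, Y)
print("scaling b =", round(b, 3), "top features:", ranking[:5])

Each alternating step (over V, over b, over the auxiliary columns E) does not increase the objective, which mirrors, in simplified form, the kind of monotone convergence argument the paper gives for PEB-FS.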

Original language: English
Pages (from-to): 547-553
Number of pages: 7
Journal: Neurocomputing
Volume: 273
DOIs
State: Published - 17 Jan 2018

Keywords

  • Feature selection
  • Optimal scaling
  • Orthogonal least square regression
  • Sparse projection
