Joint nonlinear feature selection and continuous values regression network

Zheng Wang, Feiping Nie, Canyu Zhang, Rong Wang, Xuelong Li

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

The curse of dimensionality is a long-standing, intractable issue in many machine learning and computer vision tasks. Feature selection, a data preprocessing technique, aims to select the most discriminative feature subsets to improve the performance of downstream machine learning tasks. However, most existing feature selection methods concentrate on learning a linear relationship between data points and their labels, which makes them incapable of handling the nonlinear, complex data found in real-world applications. In this paper, we propose a novel end-to-end Nonlinear Feature Selective Network (NFSN) that is able to select discriminative feature subsets while preserving their nonlinear structure, by embedding an ℓ2,p-norm regularized hidden layer into a continuous values regression network. In addition, we propose an efficient optimization algorithm that combines the back propagation algorithm with a re-weighted optimization strategy to compute the derivatives of all weights accurately. Experimental results on a nonlinear analog pulse signal and real-world datasets demonstrate the superiority of the proposed method over related feature selection methods. Our source code is available at: https://github.com/StevenWangNPU/NFSN.
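The abstract combines two ideas: an ℓ2,p-norm penalty on the weights of a selection layer, which drives whole feature-wise weight groups toward zero, and a re-weighted optimization strategy that replaces the non-smooth penalty with a weighted ℓ2 surrogate so that ordinary back propagation applies. The following is a minimal sketch of that general idea, not the authors' implementation (see the linked repository for the actual code); the network sizes, the choice p = 1, the regularization strength, and the synthetic data are all illustrative assumptions.

# Minimal sketch (assumed, not the paper's code): a regression network whose
# first hidden layer carries an l2,p-norm penalty on feature-wise weight
# groups, trained with a re-weighted surrogate and ordinary back propagation.
import torch
import torch.nn as nn

class FeatureSelectRegressor(nn.Module):
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        # Column i of select.weight corresponds to input feature i; driving a
        # whole column to zero via the l2,p penalty deselects that feature.
        self.select = nn.Linear(in_dim, hidden_dim, bias=False)
        self.regress = nn.Sequential(nn.ReLU(), nn.Linear(hidden_dim, out_dim))

    def forward(self, x):
        return self.regress(self.select(x))

def reweighted_l2p_penalty(W, p=1.0, eps=1e-8):
    """Re-weighted surrogate of sum_i ||w_i||_2^p over W's columns: each term
    is replaced by d_i * ||w_i||_2^2 with d_i = p / (2 * ||w_i||_2^(2-p)),
    where d_i is recomputed from the current (detached) W, so back propagation
    only sees a smooth weighted l2 term with the same gradient direction."""
    col_norms = W.norm(dim=0).clamp_min(eps)           # ||w_i||_2 per feature
    d = (p / (2.0 * col_norms.pow(2.0 - p))).detach()  # re-weights, no grad
    return (d * col_norms.pow(2)).sum()

# Hypothetical training loop on synthetic data.
torch.manual_seed(0)
X, y = torch.randn(256, 50), torch.randn(256, 1)
model = FeatureSelectRegressor(in_dim=50, hidden_dim=32, out_dim=1)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
lam = 1e-2  # regularization strength (assumed)

for step in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), y) \
        + lam * reweighted_l2p_penalty(model.select.weight, p=1.0)
    loss.backward()   # ordinary back propagation on the smooth surrogate
    opt.step()

# Rank features by the column norms of the selection layer's weight matrix.
scores = model.select.weight.norm(dim=0)
top_features = scores.topk(10).indices

The surrogate is the standard iteratively re-weighted treatment of ℓ2,p-norm minimization: since the gradient of d_i‖w_i‖² with d_i held fixed equals the gradient of ‖w_i‖^p, alternating between updating d_i and a back-propagation step on the weighted ℓ2 term optimizes the original penalty without handling its non-smooth points directly.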

Original language: English
Pages (from-to): 197-206
Number of pages: 10
Journal: Pattern Recognition Letters
Volume: 150
DOIs
State: Published - Oct 2021

Keywords

  • Continuous values regression
  • Nonlinear feature selection
  • Re-weighted back propagation optimization algorithm
  • ℓ2,p-Norm regularized hidden layer
