Nonlinear Feature Selection Neural Network via Structured Sparse Regularization

Rong Wang, Jintang Bian, Feiping Nie, Xuelong Li

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

Feature selection is an important and effective data preprocessing method that removes noisy and redundant features while retaining the relevant and discriminative features in high-dimensional data. In real-world applications, the relationship between data samples and their labels is usually nonlinear. However, most existing feature selection models focus on learning a linear transformation matrix, which cannot capture such nonlinear structure in practice and degrades the performance of downstream tasks. To address this issue, we propose a novel nonlinear feature selection method that selects the most relevant and discriminative features in high-dimensional datasets. Specifically, our method learns the nonlinear structure of high-dimensional data with a neural network trained under a cross-entropy loss, and uses a structured sparsity norm, such as the ℓ2,p-norm, to regularize the weight matrix connecting the input layer and the first hidden layer, so that the weight of each feature is learned. A structured sparse weight matrix is therefore obtained by conducting nonlinear learning with a structured-sparsity-regularized neural network. We then use the gradient descent method to obtain the optimal solution of the proposed model. Experimental results on several synthetic and real-world datasets demonstrate the effectiveness and superiority of the proposed nonlinear feature selection model.
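As a rough illustration of the idea described in the abstract (not the authors' code), the ℓ2,1 special case (p = 1) can be sketched in NumPy: a one-hidden-layer network is trained with softmax cross-entropy plus a row-wise ℓ2 penalty on the first-layer weight matrix, and each input feature is then scored by the ℓ2-norm of its corresponding row. All function names, hyperparameters, and the synthetic data below are illustrative assumptions.

```python
import numpy as np

def train_feature_selector(X, y, n_hidden=16, lam=0.05, p=1.0,
                           lr=0.1, epochs=500, seed=0):
    """Sketch: cross-entropy loss + lam * sum_i ||W1[i,:]||_2^p on layer 1."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    c = int(y.max()) + 1
    W1 = rng.normal(0, 0.1, (d, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.1, (n_hidden, c)); b2 = np.zeros(c)
    Y = np.eye(c)[y]                         # one-hot labels
    for _ in range(epochs):
        # forward pass: tanh hidden layer, softmax output
        H = np.tanh(X @ W1 + b1)
        logits = H @ W2 + b2
        logits -= logits.max(axis=1, keepdims=True)
        P = np.exp(logits); P /= P.sum(axis=1, keepdims=True)
        # backward pass for softmax cross-entropy
        dlogits = (P - Y) / n
        dW2 = H.T @ dlogits; db2 = dlogits.sum(0)
        dpre = (dlogits @ W2.T) * (1 - H ** 2)
        dW1 = X.T @ dpre; db1 = dpre.sum(0)
        # (sub)gradient of the ℓ2,p penalty: lam * p * ||w_i||^{p-2} * w_i
        row = np.sqrt((W1 ** 2).sum(axis=1, keepdims=True)) + 1e-8
        dW1 += lam * p * row ** (p - 2) * W1
        # plain gradient descent step
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    # feature score = ℓ2-norm of the feature's row in W1
    return np.sqrt((W1 ** 2).sum(axis=1))

# synthetic check: only features 0 and 1 determine the label
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
scores = train_feature_selector(X, y)
print(np.argsort(scores)[::-1])  # features ranked by learned importance
```

Under the structured penalty, rows of W1 belonging to irrelevant features are driven toward zero as a group, so ranking features by row norm recovers the informative ones; this mirrors the row-sparsity role of the ℓ2,p regularizer in the paper's model.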

Original language: English
Pages (from-to): 9493-9505
Number of pages: 13
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 34
Issue number: 11
DOI
Publication status: Published - 1 Nov 2023
