Self-weighted discriminative feature selection via adaptive redundancy minimization

Tong Wu, Yicang Zhou, Rui Zhang, Yanni Xiao, Feiping Nie

Research output: Contribution to journal › Article › peer-review


Abstract

In this paper, a novel self-weighted orthogonal linear discriminant analysis (SOLDA) method is first proposed, in which an optimal weight is automatically determined to balance the between-class and within-class scatter matrices. Because correlated features tend to receive similar rankings, ranking-based criteria may select top-ranked features that are highly correlated with one another, thereby introducing redundant information. To minimize this redundancy, a novel regularization term is added to the proposed SOLDA problem to penalize highly correlated features. Unlike existing methods, the redundancy matrix is optimized as a variable rather than fixed a priori, so that correlations among all features can be evaluated adaptively. In addition, a new recursive procedure is derived to obtain the selection matrix, yielding a closed-form solution while preserving orthogonality. The resulting method, self-weighted discriminative feature selection via adaptive redundancy minimization (SDFS-ARM), therefore selects non-redundant discriminative features. Finally, the effectiveness of the proposed SDFS-ARM method is validated by empirical results.
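The abstract describes combining a discriminability criterion (between-class versus within-class scatter) with a redundancy penalty on correlated features. The sketch below is only an illustrative approximation of that idea, not the authors' SDFS-ARM algorithm: it uses standard scatter-matrix definitions, a fixed Pearson-correlation matrix as the redundancy measure (whereas SDFS-ARM optimizes the redundancy matrix adaptively), and a greedy selection rule with hypothetical function names and a hypothetical trade-off parameter `lam`.

```python
import numpy as np

def scatter_matrices(X, y):
    """Between-class (S_b) and within-class (S_w) scatter for data X (n x d), labels y."""
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    S_b = np.zeros((d, d))
    S_w = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        diff = (mean_c - mean_all)[:, None]
        S_b += Xc.shape[0] * diff @ diff.T          # weighted between-class scatter
        S_w += (Xc - mean_c).T @ (Xc - mean_c)      # within-class scatter
    return S_b, S_w

def greedy_discriminative_selection(X, y, k, lam=0.1):
    """Illustrative selection of k features: per-feature Fisher-style score
    (diag of S_b over diag of S_w) minus lam times the average absolute
    correlation with already-selected features (fixed redundancy, unlike SDFS-ARM)."""
    S_b, S_w = scatter_matrices(X, y)
    disc = np.diag(S_b) / (np.diag(S_w) + 1e-12)
    R = np.abs(np.corrcoef(X, rowvar=False))        # feature-feature redundancy proxy
    selected = []
    for _ in range(k):
        best, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            red = R[j, selected].mean() if selected else 0.0
            score = disc[j] - lam * red
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

# Toy usage: one informative feature and one near-duplicate of it.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))
y = np.repeat([0, 1, 2], 20)
X[:, 0] += y                                        # informative feature
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=60)      # redundant copy of feature 0
print(greedy_discriminative_selection(X, y, k=3))   # redundancy penalty discourages picking both
```

The point of the example is the interplay the paper targets: a purely ranking-based criterion would select both correlated features, whereas adding the redundancy penalty discourages this; SDFS-ARM goes further by learning the redundancy matrix jointly with the selection matrix.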

Original language: English
Pages (from-to): 2824-2830
Number of pages: 7
Journal: Neurocomputing
Volume: 275
DOIs
State: Published - 31 Jan 2018

Keywords

  • Adaptive correlation
  • Linear discriminant analysis
  • Redundancy minimization
  • Self-adaptive weight
  • Supervised feature selection
