Joint Principal Component and Discriminant Analysis for Dimensionality Reduction

Xiaowei Zhao, Jun Guo, Feiping Nie, Ling Chen, Zhihui Li, Huaxiang Zhang

Research output: Contribution to journal › Article › peer-review

45 Citations (Scopus)

Abstract

Linear discriminant analysis (LDA) is the most widely used supervised dimensionality reduction approach. After removing the null space of the total scatter matrix St via principal component analysis (PCA), the LDA algorithm can avoid the small sample size problem. Most existing supervised dimensionality reduction methods therefore extract the principal components of the data first and then conduct LDA on them. However, the directions of largest variance found by PCA are often, but not always, the most important ones for discrimination. Thus, this two-step strategy may fail to retain the most discriminant information for classification tasks. Different from traditional approaches, which conduct PCA and LDA in sequence, we propose a novel method, referred to as joint principal component and discriminant analysis (JPCDA), for dimensionality reduction. With this method, we not only avoid the small sample size problem but also extract discriminant information for classification tasks. An iterative optimization algorithm is developed to solve the resulting problem. To validate the efficacy of the proposed method, we perform extensive experiments on several benchmark data sets in comparison with state-of-the-art dimensionality reduction methods. The experimental results show that the proposed method achieves quite promising classification performance.
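
For context, the two-step strategy the abstract criticizes can be reproduced with off-the-shelf tools. The sketch below is a baseline illustration of that sequential PCA-then-LDA pipeline, not the authors' JPCDA algorithm; the data set, variance threshold, and nearest-neighbor evaluation are illustrative assumptions.

```python
# Minimal sketch of the conventional two-step PCA + LDA baseline
# (NOT the proposed JPCDA method). Data set and parameters are
# illustrative assumptions.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Step 1: PCA removes near-null-space directions of the total scatter,
# sidestepping the small sample size (singular scatter matrix) problem.
pca = PCA(n_components=0.95).fit(X_train)
Z_train, Z_test = pca.transform(X_train), pca.transform(X_test)

# Step 2: LDA on the PCA subspace; at most (n_classes - 1) = 9
# discriminant dimensions for the 10-class digits data.
lda = LinearDiscriminantAnalysis(n_components=9).fit(Z_train, y_train)
W_train, W_test = lda.transform(Z_train), lda.transform(Z_test)

# Evaluate the embedding with a 1-NN classifier, a common protocol
# in dimensionality reduction experiments.
acc = KNeighborsClassifier(n_neighbors=1).fit(
    W_train, y_train).score(W_test, y_test)
print(f"1-NN accuracy after PCA+LDA: {acc:.3f}")
```

Because Step 1 selects directions by variance alone, it may discard low-variance but discriminative directions; JPCDA instead optimizes the two projections jointly with the paper's iterative algorithm.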

Original language: English
Article number: 8718522
Pages (from-to): 433-444
Number of pages: 12
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 31
Issue number: 2
DOI
Publication status: Published - Feb. 2020
