TY - JOUR
T1 - Joint Principal Component and Discriminant Analysis for Dimensionality Reduction
AU - Zhao, Xiaowei
AU - Guo, Jun
AU - Nie, Feiping
AU - Chen, Ling
AU - Li, Zhihui
AU - Zhang, Huaxiang
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/2
Y1 - 2020/2
N2 - Linear discriminant analysis (LDA) is the most widely used supervised dimensionality reduction approach. By removing the null space of the total scatter matrix St via principal component analysis (PCA), the LDA algorithm can avoid the small sample size problem. Most existing supervised dimensionality reduction methods first extract the principal components of the data and then conduct LDA on them. However, the directions of largest variance found by PCA are often, but not always, the most important ones for classification. Thus, this two-step strategy may fail to capture the most discriminant information for classification tasks. Unlike traditional approaches, which conduct PCA and LDA in sequence, we propose a novel method, referred to as joint principal component and discriminant analysis (JPCDA), for dimensionality reduction. With this method, we not only avoid the small sample size problem but also extract discriminant information for classification tasks. An iterative optimization algorithm is proposed to solve the resulting problem. To validate the efficacy of the proposed method, we perform extensive experiments on several benchmark data sets, comparing against state-of-the-art dimensionality reduction methods. The experimental results show that the proposed method achieves promising classification performance.
AB - Linear discriminant analysis (LDA) is the most widely used supervised dimensionality reduction approach. By removing the null space of the total scatter matrix St via principal component analysis (PCA), the LDA algorithm can avoid the small sample size problem. Most existing supervised dimensionality reduction methods first extract the principal components of the data and then conduct LDA on them. However, the directions of largest variance found by PCA are often, but not always, the most important ones for classification. Thus, this two-step strategy may fail to capture the most discriminant information for classification tasks. Unlike traditional approaches, which conduct PCA and LDA in sequence, we propose a novel method, referred to as joint principal component and discriminant analysis (JPCDA), for dimensionality reduction. With this method, we not only avoid the small sample size problem but also extract discriminant information for classification tasks. An iterative optimization algorithm is proposed to solve the resulting problem. To validate the efficacy of the proposed method, we perform extensive experiments on several benchmark data sets, comparing against state-of-the-art dimensionality reduction methods. The experimental results show that the proposed method achieves promising classification performance.
KW - Joint principal component and discriminant analysis (JPCDA)
KW - small sample size problem
KW - most discriminant information
KW - null space of the total scatter matrix
UR - http://www.scopus.com/inward/record.url?scp=85079102392&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2019.2904701
DO - 10.1109/TNNLS.2019.2904701
M3 - Article
C2 - 31107663
AN - SCOPUS:85079102392
SN - 2162-237X
VL - 31
SP - 433
EP - 444
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 2
M1 - 8718522
ER -