TY - JOUR
T1 - Ratio Sum Versus Sum Ratio for Linear Discriminant Analysis
AU - Wang, Jingyu
AU - Wang, Hongmei
AU - Nie, Feiping
AU - Li, Xuelong
N1 - Publisher Copyright:
© 1979-2012 IEEE.
PY - 2022/12/1
Y1 - 2022/12/1
N2 - Dimension reduction is a critical technique for high-dimensional data processing, and Linear Discriminant Analysis (LDA) and its variants are effective supervised methods for it. However, LDA prefers features with smaller variance, which causes features with weak discriminative ability to be retained. In this paper, we propose a novel method, Ratio Sum for Linear Discriminant Analysis (RSLDA), which aims to maximize the discriminative ability of each feature in the subspace. Specifically, it maximizes the sum of the ratios of the between-class distance to the within-class distance in each dimension of the subspace. Since a closed-form solution of the original RSLDA problem is difficult to obtain, an equivalent problem is developed that can be solved by an alternating optimization algorithm. To solve the equivalent problem, it is transformed into two sub-problems: one can be solved directly, and the other is converted into a convex optimization problem in which singular value decomposition is employed instead of matrix inversion. Consequently, the performance of the algorithm is not affected by the singularity of the covariance matrix. Furthermore, Kernel RSLDA (KRSLDA) is presented to improve the robustness of RSLDA. Additionally, the time complexities of RSLDA and KRSLDA are analyzed. Extensive experiments show that RSLDA and KRSLDA outperform comparison methods on toy datasets and multiple public datasets.
AB - Dimension reduction is a critical technique for high-dimensional data processing, and Linear Discriminant Analysis (LDA) and its variants are effective supervised methods for it. However, LDA prefers features with smaller variance, which causes features with weak discriminative ability to be retained. In this paper, we propose a novel method, Ratio Sum for Linear Discriminant Analysis (RSLDA), which aims to maximize the discriminative ability of each feature in the subspace. Specifically, it maximizes the sum of the ratios of the between-class distance to the within-class distance in each dimension of the subspace. Since a closed-form solution of the original RSLDA problem is difficult to obtain, an equivalent problem is developed that can be solved by an alternating optimization algorithm. To solve the equivalent problem, it is transformed into two sub-problems: one can be solved directly, and the other is converted into a convex optimization problem in which singular value decomposition is employed instead of matrix inversion. Consequently, the performance of the algorithm is not affected by the singularity of the covariance matrix. Furthermore, Kernel RSLDA (KRSLDA) is presented to improve the robustness of RSLDA. Additionally, the time complexities of RSLDA and KRSLDA are analyzed. Extensive experiments show that RSLDA and KRSLDA outperform comparison methods on toy datasets and multiple public datasets.
KW - dimensionality reduction
KW - kernel technique
KW - linear discriminant analysis
KW - ratio sum problem
KW - subspace learning
KW - Supervised learning
UR - http://www.scopus.com/inward/record.url?scp=85121339512&partnerID=8YFLogxK
U2 - 10.1109/TPAMI.2021.3133351
DO - 10.1109/TPAMI.2021.3133351
M3 - Article
C2 - 34874851
AN - SCOPUS:85121339512
SN - 0162-8828
VL - 44
SP - 10171
EP - 10185
JO - IEEE Transactions on Pattern Analysis and Machine Intelligence
JF - IEEE Transactions on Pattern Analysis and Machine Intelligence
IS - 12
ER -