TY - JOUR
T1 - Robust and Efficient Linear Discriminant Analysis with L2,1-Norm for Feature Selection
AU - Yang, Libo
AU - Liu, Xuemei
AU - Nie, Feiping
AU - Liu, Yang
N1 - Publisher Copyright:
© 2013 IEEE.
PY - 2020
Y1 - 2020
AB - Feature selection and feature transformation are the two main approaches to reducing dimensionality, and they are often presented separately. In this study, a novel robust and efficient feature selection method, called FS-VLDA-L21 (feature selection based on a variant of linear discriminant analysis and the L2,1-norm), is proposed by combining a new variant of linear discriminant analysis with L2,1 sparsity regularization. Here, feature transformation and feature selection are integrated into a unified optimization objective. To obtain significant discriminative power between classes, all the data in the same class are expected to be regressed to a single vector, and the key task is to find a transformation matrix such that the squared regression error is minimized. Therefore, we derive a new discriminant analysis from a novel view of least squares regression. In addition, we impose row sparsity on the transformation matrix through an L2,1-norm regularization term to achieve feature selection. Consequently, the most discriminative features are selected while the redundant ones are simultaneously eliminated. To solve the L2,1-norm-based optimization problem, we design a new efficient iterative re-weighted algorithm and prove its convergence. Extensive experimental results on four well-known datasets demonstrate the performance of our feature selection method.
KW - Feature selection
KW - L2,1-regularization
KW - Linear discriminant analysis
KW - Sparsity regularization
UR - http://www.scopus.com/inward/record.url?scp=85082047487&partnerID=8YFLogxK
U2 - 10.1109/ACCESS.2020.2978287
DO - 10.1109/ACCESS.2020.2978287
M3 - Article
AN - SCOPUS:85082047487
SN - 2169-3536
VL - 8
SP - 44100
EP - 44110
JO - IEEE Access
JF - IEEE Access
M1 - 9023949
ER -