TY - JOUR
T1 - Discriminative least squares regression for multiclass classification and feature selection
AU - Xiang, Shiming
AU - Nie, Feiping
AU - Meng, Gaofeng
AU - Pan, Chunhong
AU - Zhang, Changshui
PY - 2012
Y1 - 2012
N2 - This paper presents a framework of discriminative least squares regression (LSR) for multiclass classification and feature selection. The core idea is to enlarge the distance between different classes under the conceptual framework of LSR. First, a technique called ε-dragging is introduced to force the regression targets of different classes to move along opposite directions so that the distances between classes are enlarged. The ε-draggings are then integrated into the LSR model for multiclass classification. Our learning framework, referred to as discriminative LSR, has a compact model form in which there is no need to train independent two-class machines. Owing to this compact form, the model extends naturally to feature selection; this goal is achieved via the matrix L2,1 norm, which yields a sparse learning model. The multiclass classification model and its feature-selection extension are both solved elegantly and efficiently. Experimental evaluation over a range of benchmark datasets demonstrates the validity of our method.
KW - Feature selection
KW - least squares regression
KW - multiclass classification
KW - sparse learning
UR - http://www.scopus.com/inward/record.url?scp=84875878163&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2012.2212721
DO - 10.1109/TNNLS.2012.2212721
M3 - Article
AN - SCOPUS:84875878163
SN - 2162-237X
VL - 23
SP - 1738
EP - 1754
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 11
M1 - 6298965
ER -