Abstract
This paper presents a framework of discriminative least squares regression (LSR) for multiclass classification and feature selection. The core idea is to enlarge the distance between different classes under the conceptual framework of LSR. First, a technique called ε-dragging is introduced to force the regression targets of different classes to move in opposite directions, so that the distances between classes are enlarged. The ε-draggings are then integrated into the LSR model for multiclass classification. Our learning framework, referred to as discriminative LSR, has a compact model form: there is no need to train a set of mutually independent two-class machines. Owing to this compact form, the model extends naturally to feature selection. This is achieved by means of the L2,1 norm of the transformation matrix, yielding a sparse learning model for feature selection. Both the multiclass classification model and its feature-selection extension are solved elegantly and efficiently. Experimental evaluation on a range of benchmark datasets demonstrates the validity of our method.
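As a loose illustration of the ideas summarized above, and not the authors' reference algorithm, the following sketch assumes a simple alternating scheme: a closed-form ridge-style update of the regression weights, followed by a nonnegative update of an ε-dragging matrix M. All symbol names (X, y, B, M, lam) and the choice to regularize the bias term are assumptions made for this sketch.

```python
import numpy as np

def dlsr_fit(X, y, n_classes, lam=0.1, n_iter=30):
    """Minimal sketch of discriminative LSR with epsilon-dragging (illustrative only).

    Alternates between a closed-form regularized least squares step for the
    weights and a nonnegative update of the dragging matrix M; this is an
    assumed scheme, not necessarily the paper's exact algorithm.
    """
    n, d = X.shape
    # Base targets Y: +1 for the true class, -1 otherwise.
    Y = -np.ones((n, n_classes))
    Y[np.arange(n), y] = 1.0
    B = Y.copy()                        # allowed dragging direction per entry
    M = np.zeros((n, n_classes))        # learned nonnegative dragging amounts

    Xa = np.hstack([X, np.ones((n, 1))])                        # absorb bias into weights
    G = np.linalg.inv(Xa.T @ Xa + lam * np.eye(d + 1)) @ Xa.T   # ridge solve factor

    for _ in range(n_iter):
        T = Y + B * M                   # relaxed (dragged) regression targets
        W = G @ T                       # closed-form least squares step
        R = Xa @ W - Y                  # residual w.r.t. the base targets
        M = np.maximum(B * R, 0.0)      # drag only where it pushes classes apart
    return W

def dlsr_predict(W, X):
    """Assign each sample to the class with the largest regression score."""
    scores = np.hstack([X, np.ones((len(X), 1))]) @ W
    return scores.argmax(axis=1)
```

The feature-selection extension mentioned in the abstract would replace the Frobenius-style regularizer assumed above with the L2,1 norm of the weight matrix, which drives entire rows (i.e., features) to zero; that variant no longer admits the single closed-form step used here and is typically handled with an iteratively reweighted solver.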
Original language | English |
---|---|
Article number | 6298965 |
Pages (from-to) | 1738-1754 |
Number of pages | 17 |
Journal | IEEE Transactions on Neural Networks and Learning Systems |
Volume | 23 |
Issue number | 11 |
DOIs | |
State | Published - 2012 |
Externally published | Yes |
Keywords
- Feature selection
- least squares regression
- multiclass classification
- sparse learning