Abstract
In this brief, we address the trace ratio (TR) problem for semi-supervised dimension reduction. We first reformulate the objective function of the recent semi-supervised discriminant analysis (SDA) method in a TR form. We also observe that in SDA the low-dimensional data representation F is constrained to lie in the linear subspace spanned by the training data matrix X (i.e., F = X^T W). To relax this hard constraint, we introduce a flexible regularizer ||F - X^T W||^2, which models the regression residual, into the reformulated objective function. With this relaxation, our method, referred to as TR-based flexible SDA (TR-FSDA), can better cope with data sampled from a certain type of nonlinear manifold that is somewhat close to a linear subspace. To address the non-trivial optimization problem in TR-FSDA, we further develop an iterative algorithm that simultaneously solves for the low-dimensional data representation F and the projection matrix W. Moreover, we theoretically prove that this iterative algorithm converges to the optimum, based on the Newton-Raphson method. Experiments on two face databases, one shape image database, and one webpage database demonstrate that TR-FSDA outperforms existing semi-supervised dimension reduction methods.
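As a rough illustrative sketch of the formulation the abstract describes (the exact objective, constraints, and graph-based smoothness term appear in the paper itself), the relaxation replaces the hard constraint F = X^T W with a residual penalty inside a trace-ratio criterion; the symbols A, B, and mu below are placeholders introduced here for illustration only:

```latex
% A minimal sketch, assuming placeholder scatter matrices A and B and a
% trade-off parameter mu -- not the paper's exact objective. SDA's hard
% constraint F = X^T W is relaxed into the residual penalty
% ||F - X^T W||^2 inside the trace-ratio criterion.
\[
\max_{F,\;W}\;
\frac{\operatorname{tr}\!\left(F^{\top} A\, F\right)}
     {\operatorname{tr}\!\left(F^{\top} B\, F\right)
      + \mu\,\lVert F - X^{\top} W \rVert_F^{2}}
\]
```

The iterative algorithm described above alternates between updating F and W under such a criterion, and the Newton-Raphson argument is what establishes convergence of the resulting trace-ratio value.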
| Original language | English |
| --- | --- |
| Article number | 6129431 |
| Pages (from-to) | 519-526 |
| Number of pages | 8 |
| Journal | IEEE Transactions on Neural Networks and Learning Systems |
| Volume | 23 |
| Issue number | 3 |
| DOIs | |
| State | Published - 2012 |
| Externally published | Yes |
Keywords
- Flexible semi-supervised discriminant analysis
- semi-supervised dimension reduction
- trace ratio