Abstract
A large family of algorithms for unsupervised dimension reduction exploits both the local and global structures of the data. A fundamental step in these methods is modeling the local geometrical structure of the data. However, previous methods largely ignore two facts in this step: 1) the dimensionality of the data is usually far larger than the number of local data points, which makes local modeling a typical ill-posed problem, and 2) the data may be polluted by noise. These issues can lead to an inaccurately learned local structure and degrade the final performance. In this paper, we propose a novel unsupervised dimension-reduction method that addresses these problems effectively while also preserving the global information of the input data. Specifically, we first denoise the local data by preserving their principal components, and we then apply a regularization term to the local modeling function to solve the ill-posed problem. Next, we use a linear regression model to capture the local geometrical structure, which is demonstrated to be insensitive to the parameters. Finally, we propose two criteria to simultaneously model both the local and the global information. Theoretical analyses of the relations between the proposed methods and some classical dimension-reduction methods are presented. Experimental results on various databases demonstrate the effectiveness of our methods.
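The local modeling step described above can be sketched roughly as follows. This is a minimal illustration, not the paper's actual algorithm: the function name `denoise_and_fit_local`, the number of retained components, and the Tikhonov-style regularizer are all assumptions introduced here to show how PCA denoising and a regularized linear reconstruction could be combined for a neighborhood whose size is much smaller than the ambient dimension.

```python
import numpy as np

def denoise_and_fit_local(X_local, x_center, n_components=2, reg=1e-2):
    """Illustrative sketch (not the paper's method) of local modeling.

    X_local: (k, d) array of neighbors of x_center; d may far exceed k,
    which is the ill-posed regime described in the abstract.
    1) Denoise the neighborhood by projecting onto its top principal components.
    2) Fit regularized linear-reconstruction weights for the center point;
       the regularizer keeps the problem well-posed when k << d.
    """
    # Center the neighborhood before extracting principal components.
    mean = X_local.mean(axis=0)
    Xc = X_local - mean

    # Keep only the leading principal directions (denoising step).
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T                      # (d, n_components)
    X_dn = mean + (Xc @ V) @ V.T                 # denoised neighbors, (k, d)

    # Regularized reconstruction weights w for x_center from the denoised
    # neighbors: build the local Gram matrix and add a scaled identity
    # (Tikhonov regularization) before solving.
    k = len(X_local)
    D = X_dn - x_center                          # (k, d) displacements
    G = D @ D.T                                  # local Gram matrix, (k, k)
    G += reg * (np.trace(G) / k + 1e-12) * np.eye(k)
    w = np.linalg.solve(G, np.ones(k))
    w /= w.sum()                                 # affine reconstruction weights
    return w
```

The weights sum to one, so they describe the center point as an affine combination of its denoised neighbors, in the spirit of locally linear reconstruction; the regularization constant `reg` plays the role of the abstract's regularization term.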
| Field | Value |
| --- | --- |
| Original language | English |
| Article number | 8248662 |
| Pages (from-to) | 4882-4893 |
| Number of pages | 12 |
| Journal | IEEE Transactions on Neural Networks and Learning Systems |
| Volume | 29 |
| Issue number | 10 |
| DOIs | |
| State | Published - Oct 2018 |
Keywords
- Dimension reduction
- feature extraction
- manifold learning
- unsupervised learning