Simultaneously Learning Neighborship and Projection Matrix for Supervised Dimensionality Reduction

Yanwei Pang, Bo Zhou, Feiping Nie

Research output: Contribution to journal › Article › peer-review

40 Citations (Scopus)

Abstract

Explicitly or implicitly, most dimensionality reduction methods need to determine which samples are neighbors and the similarities between those neighbors in the original high-dimensional space. The projection matrix is then learnt on the assumption that the neighborhood information, e.g., the similarities, is known and fixed prior to learning. However, it is difficult to precisely measure the intrinsic similarities of samples in high-dimensional space because of the curse of dimensionality. Consequently, the neighbors selected according to such similarities, and the projection matrix obtained from those similarities and neighbors, might not be optimal in the sense of classification and generalization. To overcome this drawback, in this paper, we propose to treat the similarities and neighbors as variables and model them in a low-dimensional space. Both the optimal similarity and the projection matrix are obtained by minimizing a unified objective function. Nonnegative and sum-to-one constraints on the similarity are adopted. Instead of empirically setting the regularization parameter, we treat it as a variable to be optimized. Interestingly, the optimal regularization parameter is adaptive to the neighbors in the low-dimensional space and has an intuitive meaning. Experimental results on the YALE B, COIL-100, and MNIST data sets demonstrate the effectiveness of the proposed method.
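The alternating scheme the abstract describes can be illustrated with a minimal sketch. The snippet below is an assumption-laden stand-in, not the authors' exact algorithm: it alternates (a) a similarity update in the projected space with nonnegative, sum-to-one constraints, using the closed-form "adaptive neighbors" solution in which the regularization parameter is determined by the k-th and (k+1)-th neighbor distances, and (b) a projection update via the eigenvectors of X^T L X for the Laplacian L of the learned similarity graph. It omits the supervision (class labels) and any implementation details of the published method.

```python
import numpy as np

def update_similarities(Y, k=5):
    """Given low-dimensional points Y (n x m), solve for each row i
    min_{s_i >= 0, 1^T s_i = 1}  sum_j ||y_i - y_j||^2 s_ij + gamma_i ||s_i||^2.
    Choosing gamma_i adaptively so exactly k neighbors get nonzero weight
    yields the closed form below (an illustrative adaptive-neighbors update,
    not necessarily the paper's)."""
    n = Y.shape[0]
    # pairwise squared Euclidean distances in the projected space
    D = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=2)
    np.fill_diagonal(D, np.inf)          # a sample is not its own neighbor
    S = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D[i])[: k + 1]  # k nearest neighbors plus one extra
        d = D[i, idx]
        # denominator k*d_{k+1} - sum(d_1..d_k) plays the role of the
        # adaptive regularization parameter mentioned in the abstract
        s = (d[k] - d[:k]) / (k * d[k] - d[:k].sum() + 1e-12)
        S[i, idx[:k]] = s
    return S

def update_projection(X, S, m):
    """Given similarities S, minimize sum_ij s_ij ||W^T x_i - W^T x_j||^2
    s.t. W^T W = I: take the eigenvectors of X^T L X for the m smallest
    eigenvalues, where L is the Laplacian of the symmetrized graph."""
    A = (S + S.T) / 2
    L = np.diag(A.sum(axis=1)) - A
    vals, vecs = np.linalg.eigh(X.T @ L @ X)   # ascending eigenvalues
    return vecs[:, :m]                         # d x m projection matrix

# Alternate the two updates on toy data (hypothetical sizes).
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 10))                      # 60 samples, 10 dims
W = np.linalg.qr(rng.normal(size=(10, 3)))[0]      # orthonormal start
for _ in range(5):
    S = update_similarities(X @ W, k=5)
    W = update_projection(X, S, m=3)
```

Each similarity row lies on the probability simplex by construction, and the eigendecomposition keeps W column-orthonormal, mirroring the constraints stated in the abstract.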

Original language: English
Pages (from-to): 2779-2793
Number of pages: 15
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 30
Issue number: 9
DOI
Publication status: Published - 1 Sep 2019
