TY - JOUR
T1 - Semi-supervised Gaussian process latent variable model with pairwise constraints
AU - Wang, Xiumei
AU - Gao, Xinbo
AU - Yuan, Yuan
AU - Tao, Dacheng
AU - Li, Jie
PY - 2010/6
Y1 - 2010/6
N2 - In machine learning, the Gaussian process latent variable model (GP-LVM) has been extensively applied to unsupervised dimensionality reduction. When supervised information, e.g., pairwise constraints or labels of the data, is available, the traditional GP-LVM cannot directly utilize it to improve the performance of dimensionality reduction. In this case, it is necessary to modify the traditional GP-LVM so that it can handle supervised or semi-supervised learning tasks. For this purpose, we propose a new semi-supervised GP-LVM framework under pairwise constraints. By transferring the pairwise constraints in the observed space to the latent space, constrained prior information on the latent variables is obtained. Under this constrained prior, the latent variables are optimized by maximum a posteriori (MAP) estimation. The effectiveness of the proposed algorithm is demonstrated by experiments on a variety of data sets.
AB - In machine learning, the Gaussian process latent variable model (GP-LVM) has been extensively applied to unsupervised dimensionality reduction. When supervised information, e.g., pairwise constraints or labels of the data, is available, the traditional GP-LVM cannot directly utilize it to improve the performance of dimensionality reduction. In this case, it is necessary to modify the traditional GP-LVM so that it can handle supervised or semi-supervised learning tasks. For this purpose, we propose a new semi-supervised GP-LVM framework under pairwise constraints. By transferring the pairwise constraints in the observed space to the latent space, constrained prior information on the latent variables is obtained. Under this constrained prior, the latent variables are optimized by maximum a posteriori (MAP) estimation. The effectiveness of the proposed algorithm is demonstrated by experiments on a variety of data sets.
KW - Dimensionality reduction
KW - Gaussian process latent variable model
KW - Pairwise constraints
KW - Semi-supervised learning
UR - http://www.scopus.com/inward/record.url?scp=77952543650&partnerID=8YFLogxK
U2 - 10.1016/j.neucom.2010.01.021
DO - 10.1016/j.neucom.2010.01.021
M3 - Article
AN - SCOPUS:77952543650
SN - 0925-2312
VL - 73
SP - 2186
EP - 2195
JO - Neurocomputing
JF - Neurocomputing
IS - 10-12
ER -