TY - JOUR
T1 - Semisupervised learning using negative labels
AU - Hou, Chenping
AU - Nie, Feiping
AU - Wang, Fei
AU - Zhang, Changshui
AU - Wu, Yi
PY - 2011/3
Y1 - 2011/3
N2 - The problem of semisupervised learning has attracted considerable research interest in the past few years. Most existing methods aim to learn from a partially labeled dataset, i.e., they assume that the exact labels of some data are already known. In this paper, we propose to use a novel type of supervision information to guide the process of semisupervised learning, which indicates whether a point does not belong to a specific category. We call this kind of information a negative label (NL) and propose a novel approach called NL propagation (NLP) to efficiently make use of this type of information to assist the process of semisupervised learning. Specifically, NLP assumes that nearby points should have similar class indicators. The data labels are propagated under the guidance of NL information and the geometric structure revealed by both labeled and unlabeled points, by employing some specified initialization and parameter matrices. The convergence analysis, out-of-sample extension, parameter determination, computational complexity, and relations to other approaches are presented. We also interpret the proposed approach within the framework of regularization. Promising experimental results on image, digit, spoken letter, and text classification tasks are provided to show the effectiveness of our method.
KW - Label propagation
KW - negative labels
KW - pattern classification
KW - semisupervised learning
UR - http://www.scopus.com/inward/record.url?scp=79952187194&partnerID=8YFLogxK
DO - 10.1109/TNN.2010.2099237
M3 - Article
C2 - 21233049
AN - SCOPUS:79952187194
SN - 1045-9227
VL - 22
SP - 420
EP - 432
JO - IEEE Transactions on Neural Networks
JF - IEEE Transactions on Neural Networks
IS - 3
M1 - 5688242
ER -