A parameter-free self-training algorithm based on the three successive confirmation rule

Jikui Wang, Wei Zhao, Qingsheng Shang, Feiping Nie

Research output: Contribution to journal › Article › peer-review

Abstract

Semi-supervised learning is a popular research topic today, and self-training is a classical semi-supervised learning framework. Selecting high-confidence samples is a critical step in self-training, yet existing algorithms do not consider both the global and the local information of the data. In this paper, we propose a parameter-free self-training algorithm based on the three successive confirmation rule, which integrates global and local information to identify high-confidence samples. Concretely, the local information is obtained from the k nearest neighbors, while the global information is derived from the three successive confirmation rule. This dual selection strategy improves the quality of high-confidence samples and, in turn, the classification performance. We conduct experiments on 14 benchmark datasets, comparing our method with other self-training algorithms using accuracy and F-score as performance metrics. The experimental results demonstrate that our algorithm significantly improves classification performance, confirming its effectiveness and superiority in semi-supervised learning.
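To make the selection strategy concrete, the following is a minimal sketch of a self-training loop combining a local and a global check, under two assumptions not spelled out in the abstract: the local check requires a sample's k nearest labeled neighbors to agree with its pseudo-label, and the "three successive confirmation rule" is read as accepting a pseudo-label only after the classifier assigns the same label in three consecutive rounds. The function name `self_training_sketch` and both rules as coded are hypothetical illustrations, not the authors' exact method.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def self_training_sketch(X_l, y_l, X_u, k=3, rounds=10):
    """Illustrative self-training loop (NOT the paper's exact algorithm).

    Global check (assumed reading of the three successive confirmation
    rule): a pseudo-label is accepted only if the base classifier predicts
    the same label in three consecutive rounds.
    Local check (assumed): the sample's k nearest labeled neighbors must
    all carry that same label.
    """
    X_l, y_l = X_l.copy(), y_l.copy()
    remaining = np.arange(len(X_u))
    history = {i: [] for i in remaining}  # per-sample prediction history
    for _ in range(rounds):
        if len(remaining) == 0:
            break
        clf = KNeighborsClassifier(n_neighbors=min(k, len(X_l))).fit(X_l, y_l)
        preds = clf.predict(X_u[remaining])
        accepted = []
        for idx, p in zip(remaining, preds):
            history[idx].append(p)
            # Global check: identical label in three successive rounds.
            if len(history[idx]) >= 3 and len(set(history[idx][-3:])) == 1:
                # Local check: k nearest labeled neighbors agree.
                _, nn = clf.kneighbors(X_u[idx:idx + 1])
                if np.all(y_l[nn[0]] == p):
                    accepted.append((idx, p))
        if accepted:
            ids = [i for i, _ in accepted]
            X_l = np.vstack([X_l, X_u[ids]])
            y_l = np.concatenate([y_l, [p for _, p in accepted]])
            remaining = np.array([i for i in remaining if i not in set(ids)])
    return X_l, y_l
```

Samples passing both checks are promoted into the labeled set, and the loop retrains on the enlarged set, which is the standard self-training iteration the paper builds on.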

Original language: English
Article number: 110165
Journal: Engineering Applications of Artificial Intelligence
Volume: 144
DOIs
State: Published - 15 Mar 2025

Keywords

  • High-confidence samples
  • Self-training algorithm
  • Semi-supervised learning
  • Three successive confirmation rule

