An adaptive self-correction joint training framework for person re-identification with noisy labels

Hui Fu, Ke Zhang, Jingyu Wang

Research output: Contribution to journal › Article › peer-review


Abstract

Current person re-identification (ReID) methods rely heavily on well-annotated training data, and their performance degrades significantly in the presence of noisy labels, which are ubiquitous in real-life scenes. This is because noisy labels not only corrupt the classifier's predictions but also impede feature refinement, making it difficult to distinguish between different persons' features. To address these issues, we propose an Adaptive Self-correction Classification (ASC) loss and an Adaptive Margin Self-correction Triplet (AMSTri) loss. Specifically, the ASC loss helps the network produce better predictions by balancing annotations against predicted labels, and pays more attention to minority samples with the help of a focusing factor. The AMSTri loss, in turn, introduces an adaptive margin that varies with sample features to accommodate complex data variations, and uses predicted labels to generate reliable triplets for feature refinement. We then present an end-to-end adaptive self-correction joint training framework that incorporates the ASC and AMSTri losses to obtain a robust ReID model. Comprehensive experiments demonstrate that the proposed framework outperforms most existing counterparts.
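The abstract describes the two losses only at a conceptual level, so the PyTorch sketch below is a rough illustration of how a self-correcting classification loss with a focusing factor and an adaptive-margin triplet loss could be written, not the paper's actual formulation. The function names and all hyperparameters (`alpha`, `gamma`, `base_margin`, `scale`) are hypothetical choices for this sketch.

```python
import torch
import torch.nn.functional as F

def asc_style_loss(logits, noisy_labels, alpha=0.5, gamma=2.0):
    """Sketch of a self-correction classification loss (assumed form).

    Blends the (possibly noisy) annotation with the model's own soft
    prediction, then applies a focal-style focusing factor so that
    poorly fit (minority / hard) samples receive larger weight.
    """
    probs = F.softmax(logits, dim=1)
    one_hot = F.one_hot(noisy_labels, num_classes=logits.size(1)).float()
    # Self-corrected target: convex combination of annotation and prediction.
    target = alpha * one_hot + (1.0 - alpha) * probs.detach()
    log_probs = F.log_softmax(logits, dim=1)
    ce = -(target * log_probs).sum(dim=1)
    # Focusing factor: down-weight samples the model already fits well.
    pt = (probs * target).sum(dim=1).clamp(min=1e-6)
    return ((1.0 - pt) ** gamma * ce).mean()

def adaptive_margin_triplet_loss(anchor, positive, negative,
                                 base_margin=0.3, scale=0.1):
    """Sketch of an adaptive-margin triplet loss (assumed form).

    The margin is not fixed but varies with the feature distances of the
    triplet, so more ambiguous triplets are separated more aggressively.
    """
    d_ap = F.pairwise_distance(anchor, positive)
    d_an = F.pairwise_distance(anchor, negative)
    # Margin adapted per sample from the (detached) feature distances.
    margin = base_margin + scale * (d_ap.detach() - d_an.detach()).abs()
    return F.relu(d_ap - d_an + margin).mean()

# Joint training would then combine both terms, e.g.
# loss = asc_style_loss(logits, labels) + lam * adaptive_margin_triplet_loss(a, p, n)
```

In the joint framework described above, the triplets passed to the second loss would be mined using the predicted (self-corrected) labels rather than the raw annotations; that mining step is omitted here for brevity.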

Original language: English
Article number: 121771
Journal: Expert Systems with Applications
Volume: 238
DOIs
State: Published - 15 Mar 2024

Keywords

  • Adaptive margin self-correction triplet loss
  • Adaptive self-correction classification loss
  • Joint learning
  • Noisy labels
