
Ground truth is the best teacher: supervised semantic segmentation inspired by knowledge transfer mechanisms

Xiangchun Yu, Huofa Liu, Dingwen Zhang, Miaomiao Liang, Lingjuan Yu, Jian Zheng

Jiangxi University of Science and Technology

Research output: Contribution to journal › Article › peer-review

Abstract

Knowledge distillation typically requires additional distillation costs to improve model performance. In this paper, our focus lies in the straightforward construction of task-level losses by mimicking the knowledge transfer mechanism embedded in the existing logits-based knowledge distillation. Firstly, we put forward a method that enables direct knowledge transfer from the ground truth, with the aim of eliminating the supplementary costs linked to traditional distillation methods. Furthermore, we introduce a strategy to address the issue of overconfident softmax predictions that may emerge from this direct transfer. By applying a linear mapping to the ground truth, we can effectively regulate the model’s outputs and thus enhance the reliability of predictions. We carry out extensive experiments on the Cityscapes dataset, the Pascal Context dataset, ADE20K, and COCO Stuff164k, respectively. Both the experimental and visualization results illustrate that our proposed methods surpass the state-of-the-art KD methods in terms of training efficiency and segmentation performance.
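The abstract's core idea — transferring knowledge directly from the ground truth while tempering overconfident softmax outputs via a linear mapping of the labels — can be illustrated with a minimal sketch. The helper names, the choice of mapping coefficient `alpha`, and the uniform mixing term below are assumptions for illustration; the paper's exact linear mapping may differ.

```python
import numpy as np

def linear_map_ground_truth(labels, num_classes, alpha=0.9):
    """Linearly map hard labels to soft targets: the true class keeps
    weight alpha, the remaining mass is spread uniformly (an assumed
    form of the paper's linear mapping, akin to label smoothing)."""
    onehot = np.eye(num_classes)[labels]                 # (N, C) one-hot
    uniform = np.full_like(onehot, 1.0 / num_classes)    # uniform prior
    return alpha * onehot + (1.0 - alpha) * uniform

def soft_cross_entropy(logits, soft_targets):
    """Cross-entropy of model predictions against softened targets,
    standing in for a distillation-style loss without a teacher."""
    shifted = logits - logits.max(axis=1, keepdims=True)  # stability
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -(soft_targets * log_probs).sum(axis=1).mean()

# Toy example: 4 pixels, 3 classes (segmentation logits flattened per pixel)
labels = np.array([0, 2, 1, 2])
logits = np.random.randn(4, 3)
targets = linear_map_ground_truth(labels, num_classes=3, alpha=0.9)
loss = soft_cross_entropy(logits, targets)
```

Because the soft targets never assign full probability to one class, the loss keeps a floor above zero even for confident predictions, which discourages the overconfident softmax outputs the paper describes.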

Original language: English
Article number: 41
Journal: Multimedia Systems
Volume: 31
Issue number: 1
DOI
Publication status: Published - Feb 2025
Externally published: Yes
