Reversible Modal Conversion Model for Thermal Infrared Tracking

Yufei Zha, Fan Li, Huanyu Li, Peng Zhang, Wei Huang

Research output: Contribution to journal › Article › peer-review

Abstract

Learning a powerful CNN representation of the target is a key issue for thermal infrared (TIR) tracking. The lack of massive TIR training data is one of the obstacles to training the network end-to-end from scratch. Rather than relying on time-consuming and labor-intensive manual relabeling, in this article we obtain trainable TIR images by leveraging massive annotated RGB images. Unlike traditional image generation models, a modal reversible module is designed to maximize information propagation between the RGB and TIR modalities. The advantage is that this module preserves as much modal information as possible when the network is trained on a large number of aligned RGBT image pairs. Additionally, the fake-TIR features generated by the proposed module are integrated to enhance the target representation during online TIR tracking. To verify the proposed method, we conduct extensive experiments on both single-modal TIR and multimodal RGBT tracking datasets. In single-modal TIR tracking, our method improves the success rate by 2.8% and 0.94% over the state of the art on the LSOTB-TIR and PTB-TIR datasets, respectively. In multimodal RGBT fusion tracking, the proposed method is evaluated on the RGBT234 and VOT-RGBT2020 datasets and also reaches state-of-the-art performance.
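The abstract does not detail the internals of the modal reversible module, but the "reversible" property suggests an invertible mapping between RGB and TIR feature spaces. Below is a minimal illustrative sketch, assuming a RevNet-style additive coupling block in PyTorch; the class name, layer sizes, and sub-networks are hypothetical and are not the authors' actual design. It only demonstrates how an exactly invertible block can map RGB features to TIR-like ("fake-TIR") features and back without information loss.

```python
# Hypothetical sketch: an additive coupling block (RevNet-style) as one possible
# form of a reversible RGB-to-TIR feature conversion. Not the paper's module.
import torch
import torch.nn as nn


class ReversibleCouplingBlock(nn.Module):
    """Splits the feature map into two halves and applies additive coupling.

    Because each update is purely additive, the block is exactly invertible,
    so the TIR-like output retains all information from the RGB input features.
    """

    def __init__(self, channels: int):
        super().__init__()
        half = channels // 2
        # F and G are small shape-preserving residual functions (hypothetical).
        self.f = nn.Sequential(nn.Conv2d(half, half, 3, padding=1), nn.ReLU(),
                               nn.Conv2d(half, half, 3, padding=1))
        self.g = nn.Sequential(nn.Conv2d(half, half, 3, padding=1), nn.ReLU(),
                               nn.Conv2d(half, half, 3, padding=1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # RGB-domain features -> TIR-like ("fake-TIR") features.
        x1, x2 = torch.chunk(x, 2, dim=1)
        y1 = x1 + self.f(x2)
        y2 = x2 + self.g(y1)
        return torch.cat([y1, y2], dim=1)

    def inverse(self, y: torch.Tensor) -> torch.Tensor:
        # TIR-like features -> RGB-domain features (exact reconstruction).
        y1, y2 = torch.chunk(y, 2, dim=1)
        x2 = y2 - self.g(y1)
        x1 = y1 - self.f(x2)
        return torch.cat([x1, x2], dim=1)


if __name__ == "__main__":
    block = ReversibleCouplingBlock(channels=64)
    rgb_feat = torch.randn(1, 64, 32, 32)
    tir_like = block(rgb_feat)                 # fake-TIR features
    recovered = block.inverse(tir_like)        # invert back to RGB features
    print(torch.allclose(rgb_feat, recovered, atol=1e-5))  # True: no information lost
```

In such a design, the fake-TIR features produced by the forward pass could be fused with real TIR features at tracking time, which is consistent with the abstract's statement that generated features are integrated to enhance the target representation.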

Original language: English
Pages (from-to): 8-24
Number of pages: 17
Journal: IEEE Multimedia
Volume: 30
Issue number: 3
State: Published - 1 Jul 2023

