Person Re-Identification with Triplet Focal Loss

Shizhou Zhang, Qi Zhang, Xing Wei, Yanning Zhang, Yong Xia

Research output: Contribution to journal › Article › peer-review

33 Citations (Scopus)

Abstract

Person re-identification (ReID), which aims at matching individuals across non-overlapping cameras, has attracted much attention in the field of computer vision due to its research significance and potential applications. Triplet loss-based CNN models have been very successful for person ReID; the triplet loss optimizes the feature embedding space such that the distances between samples with the same identity are much shorter than those between samples with different identities. Researchers have found that mining hard triplets is crucial to the success of the triplet loss. In this paper, motivated by the focal loss designed for classification models, we propose the triplet focal loss for person ReID. The triplet focal loss adaptively up-weights hard triplet training samples and relatively down-weights easy triplets by simply projecting the original distance in Euclidean space into an exponential kernel space. We conduct experiments on three of the largest benchmark datasets currently available for person ReID, namely, Market-1501, DukeMTMC-ReID, and CUHK03, and the experimental results verify that the proposed triplet focal loss greatly outperforms the traditional triplet loss and achieves performance competitive with representative state-of-the-art methods.
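The abstract's core idea, re-weighting triplets by projecting Euclidean distances into an exponential kernel space, can be illustrated with a minimal sketch. This is not the paper's exact formulation; the kernel form `exp(d / sigma)`, the hinge with margin, and the parameter names `sigma` and `margin` are assumptions made for illustration.

```python
import numpy as np

def triplet_focal_loss(anchor, positive, negative, sigma=1.0, margin=0.1):
    """Hedged sketch of a triplet focal loss for one triplet.

    Distances are mapped through exp(d / sigma) before the usual
    hinge comparison, so hard triplets (large anchor-positive
    distance, small anchor-negative distance) are amplified
    relative to easy ones, while easy triplets contribute little.
    """
    d_ap = np.linalg.norm(anchor - positive)  # anchor-positive distance
    d_an = np.linalg.norm(anchor - negative)  # anchor-negative distance
    # Exponential kernel projection followed by a margin-based hinge.
    return max(0.0, np.exp(d_ap / sigma) - np.exp(d_an / sigma) + margin)

# Easy triplet: the negative is far away, so the loss vanishes.
easy = triplet_focal_loss(np.zeros(2), np.array([0.1, 0.0]), np.array([3.0, 0.0]))

# Hard triplet: the negative is closer than the positive, so the
# exponential projection yields a large positive loss.
hard = triplet_focal_loss(np.zeros(2), np.array([1.0, 0.0]), np.array([0.5, 0.0]))
```

Because `exp` grows faster than the identity, the gradient contribution of a hard triplet dominates that of an easy one, which is the adaptive re-weighting effect the abstract describes.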

Original language: English
Article number: 8558553
Pages (from-to): 78092-78099
Number of pages: 8
Journal: IEEE Access
Volume: 6
DOI
Publication status: Published - 2018
