Generalization bottleneck in deep metric learning

Zhanxuan Hu, Danyang Wu, Feiping Nie, Rong Wang

Research output: Contribution to journal › Article › peer-review

9 Citations (Scopus)

Abstract

Deep metric learning aims to learn a non-linear function that maps raw data to a discriminative lower-dimensional embedding space, where semantically similar samples have higher similarity than dissimilar ones. Most existing approaches process each raw-data sample in two steps: the sample is first mapped to a higher-dimensional feature space by a fixed backbone, and this feature space is then mapped to a lower-dimensional embedding space by a linear layer. This paradigm, however, inevitably leads to a Generalization Bottleneck (GB) problem. Specifically, GB refers to the limitation that, at test time, the generalization capacity of the lower-dimensional embedding space is inferior to that of the higher-dimensional feature space. To mitigate the capacity gap between the feature space and the embedding space, we introduce a fully learnable module, dubbed Relational Knowledge Preserving (RKP), which improves the generalization capacity of the lower-dimensional embedding space by transferring the mutual similarity of instances. The proposed RKP module can be integrated into general deep metric learning approaches, and experiments conducted on different benchmarks show that it significantly improves the performance of the original model.
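The sketch below illustrates the two-step paradigm described in the abstract (fixed backbone features projected to a lower-dimensional embedding by a linear layer) together with one plausible way to transfer the mutual similarity of instances from the feature space to the embedding space. It is not the authors' implementation: the cosine-similarity-matrix matching loss, the `EmbeddingHead` module, and all dimensions are assumptions for illustration only; the paper's RKP module is fully learnable and may be formulated differently.

```python
# Minimal sketch (assumed, not the paper's RKP module): transfer pairwise
# similarity structure from backbone features to the learned embedding space.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EmbeddingHead(nn.Module):
    """Maps higher-dimensional backbone features to a lower-dimensional
    embedding space via a single linear layer (the common paradigm)."""
    def __init__(self, feat_dim=2048, embed_dim=128):
        super().__init__()
        self.proj = nn.Linear(feat_dim, embed_dim)

    def forward(self, feats):
        return F.normalize(self.proj(feats), dim=1)

def pairwise_cosine(x):
    # Cosine similarity matrix of a batch of vectors.
    x = F.normalize(x, dim=1)
    return x @ x.t()

def relational_transfer_loss(feats, embeds):
    """Encourage the embedding space to preserve the mutual (pairwise)
    similarity structure of the feature space. The MSE matching objective
    here is an illustrative assumption."""
    with torch.no_grad():
        sim_feat = pairwise_cosine(feats)   # feature-space similarities (fixed backbone)
    sim_embed = pairwise_cosine(embeds)     # embedding-space similarities (learnable)
    return F.mse_loss(sim_embed, sim_feat)

if __name__ == "__main__":
    feats = torch.randn(32, 2048)           # hypothetical backbone outputs for a batch
    head = EmbeddingHead()
    embeds = head(feats)
    loss = relational_transfer_loss(feats, embeds)
    print(loss.item())
```

In practice such a relational term would be added to a standard metric learning loss (e.g. a pair- or proxy-based loss) rather than used alone; that combination is likewise an assumption here, not a claim about the paper's exact training objective.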

Original language: English
Pages (from-to): 249-261
Number of pages: 13
Journal: Information Sciences
Volume: 581
DOI
Publication status: Published - Dec 2021
