Asymmetric low-rank double-level cooperation for scalable discrete cross-modal hashing

Ruihan Chen, Junpeng Tan, Yinghong Zhou, Zhijing Yang, Feiping Nie, Tianshui Chen

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)

Abstract

As an efficient information retrieval technique, cross-modal hashing has received increasing attention. However, several challenges remain: (1) designing effective kernelization techniques that sufficiently strengthen the connections among different samples in kernel space and improve the class separability of the data; (2) learning robust, compact common representations that fully extract the correlation information between different modalities; and (3) fully leveraging the underlying semantic information and embedding it into optimal hash codes. To address these challenges, we propose a novel algorithm called Asymmetric Low-rank Double-level Cooperation Hashing (ALDCH). First, we propose a novel Nonlinear Normalized Space Kernelization (NNSK) submodule to obtain high-quality kernelized features, which not only captures more powerful nonlinear structure representations but also better expresses the nonlinear intra-modal correlations among the original features. To learn high-quality compact representations, we further propose a novel Low-rank Double-level Cooperation Mapping (LDCM) submodule with an L2,1-norm constraint, which enhances the correlation between the coefficient spaces of different modalities and enables the samples to learn constrained, compact hash representations. In addition, the proposed method fully exploits the underlying semantic label information by introducing a Semantic Pairwise Correlation Learning (SPCL) submodule. Extensive experiments on benchmark datasets demonstrate the accuracy and efficiency of ALDCH, which outperforms many state-of-the-art methods.
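The L2,1-norm constraint used in the LDCM submodule is a standard row-sparsity regularizer. As a generic illustration only (not the authors' implementation), it can be computed as the sum of the Euclidean norms of a matrix's rows; penalizing it drives entire rows toward zero, which is what yields the constrained, compact representations described above:

```python
import numpy as np

def l21_norm(M: np.ndarray) -> float:
    """Compute the L2,1-norm of a matrix: the sum of the
    Euclidean (L2) norms of its rows.

    Penalizing this quantity encourages whole rows of M to shrink
    toward zero, i.e. row-wise (structured) sparsity.
    """
    return float(np.sum(np.linalg.norm(M, axis=1)))

# Example: each row contributes its L2 norm; a zero row contributes nothing.
M = np.array([[3.0, 4.0],   # row norm 5.0
              [0.0, 0.0],   # row norm 0.0
              [0.0, 1.0]])  # row norm 1.0
print(l21_norm(M))  # 6.0
```

This contrasts with the element-wise L1-norm, which sparsifies individual entries rather than whole rows.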

Original language: English
Article number: 121703
Journal: Expert Systems with Applications
Volume: 237
DOI
Publication status: Published - 1 Mar 2024
