TY - JOUR
T1 - Asymmetric low-rank double-level cooperation for scalable discrete cross-modal hashing
AU - Chen, Ruihan
AU - Tan, Junpeng
AU - Zhou, Yinghong
AU - Yang, Zhijing
AU - Nie, Feiping
AU - Chen, Tianshui
N1 - Publisher Copyright:
© 2023 Elsevier Ltd
PY - 2024/3/1
Y1 - 2024/3/1
N2 - As an efficient information retrieval technique, cross-modal hashing has received increasing attention. However, remaining challenges include (1) designing effective kernelization techniques that sufficiently strengthen the connections among different samples in kernel space and improve the class separability of the data; (2) learning robust, compact common representations that fully extract the correlation information between different modalities; and (3) fully leveraging the underlying semantic information and embedding it into optimal hash codes. To address these challenges, we propose a novel algorithm called Asymmetric Low-rank Double-level Cooperation Hashing (ALDCH). First, we propose a Nonlinear Normalized Space Kernelization (NNSK) submodule to obtain high-quality kernelized features, which not only captures more powerful nonlinear structure representations but also better expresses the nonlinear intra-modal correlations among the original features. To learn high-quality compact representations, we further propose a Low-rank Double-level Cooperation Mapping (LDCM) submodule with an L2,1-norm constraint, which enhances the correlation between the coefficient spaces of different modalities and enables the samples to learn constrained, compact hash representations. In addition, the proposed method fully utilizes the underlying semantic label information by introducing a Semantic Pairwise Correlation Learning (SPCL) submodule. Extensive experiments on benchmark datasets demonstrate the accuracy and efficiency of ALDCH, which outperforms many state-of-the-art methods.
AB - As an efficient information retrieval technique, cross-modal hashing has received increasing attention. However, remaining challenges include (1) designing effective kernelization techniques that sufficiently strengthen the connections among different samples in kernel space and improve the class separability of the data; (2) learning robust, compact common representations that fully extract the correlation information between different modalities; and (3) fully leveraging the underlying semantic information and embedding it into optimal hash codes. To address these challenges, we propose a novel algorithm called Asymmetric Low-rank Double-level Cooperation Hashing (ALDCH). First, we propose a Nonlinear Normalized Space Kernelization (NNSK) submodule to obtain high-quality kernelized features, which not only captures more powerful nonlinear structure representations but also better expresses the nonlinear intra-modal correlations among the original features. To learn high-quality compact representations, we further propose a Low-rank Double-level Cooperation Mapping (LDCM) submodule with an L2,1-norm constraint, which enhances the correlation between the coefficient spaces of different modalities and enables the samples to learn constrained, compact hash representations. In addition, the proposed method fully utilizes the underlying semantic label information by introducing a Semantic Pairwise Correlation Learning (SPCL) submodule. Extensive experiments on benchmark datasets demonstrate the accuracy and efficiency of ALDCH, which outperforms many state-of-the-art methods.
KW - Compact common representations
KW - Cross-modal hashing
KW - Double-level cooperation
KW - Nonlinear normalized space kernelization
UR - http://www.scopus.com/inward/record.url?scp=85172663622&partnerID=8YFLogxK
U2 - 10.1016/j.eswa.2023.121703
DO - 10.1016/j.eswa.2023.121703
M3 - Article
AN - SCOPUS:85172663622
SN - 0957-4174
VL - 237
JO - Expert Systems with Applications
JF - Expert Systems with Applications
M1 - 121703
ER -