TY - GEN
T1 - Graph Contrastive Learning with Kernel Dependence Maximization for Social Recommendation
AU - Ni, Xuelian
AU - Xiong, Fei
AU - Zheng, Yu
AU - Wang, Liang
N1 - Publisher Copyright:
© 2024 ACM.
PY - 2024/5/13
Y1 - 2024/5/13
N2 - Contrastive learning (CL) has recently catalyzed a productive avenue of research for recommendation. The efficacy of most CL methods for recommendation may hinge on their capacity to learn representation uniformity by mapping the data onto a hypersphere. Nonetheless, applying contrastive learning to downstream recommendation tasks remains challenging, as existing CL methods encounter difficulties in capturing the nonlinear dependence of representations in high-dimensional space and struggle to learn the hierarchical social dependency among users, both of which are essential for modeling user preferences. Moreover, the subtle distinctions between augmented representations render CL methods sensitive to noise perturbations. Inspired by the Hilbert-Schmidt independence criterion (HSIC), we propose a graph Contrastive Learning model with Kernel Dependence Maximization (CL-KDM) for social recommendation to address these challenges. Specifically, to explicitly learn the kernel dependence of representations and improve the robustness and generalization of recommendation, we maximize the kernel dependence of augmented representations in reproducing kernel Hilbert space by introducing HSIC into graph contrastive learning. Additionally, to extract the hierarchical social dependency across users while preserving underlying structures, we design a hierarchical mutual information maximization module for generating augmented user representations, which are injected into the message passing of a graph neural network to enhance recommendation. Extensive experiments are conducted on three social recommendation datasets, and the results indicate that CL-KDM outperforms various baseline recommendation methods.
AB - Contrastive learning (CL) has recently catalyzed a productive avenue of research for recommendation. The efficacy of most CL methods for recommendation may hinge on their capacity to learn representation uniformity by mapping the data onto a hypersphere. Nonetheless, applying contrastive learning to downstream recommendation tasks remains challenging, as existing CL methods encounter difficulties in capturing the nonlinear dependence of representations in high-dimensional space and struggle to learn the hierarchical social dependency among users, both of which are essential for modeling user preferences. Moreover, the subtle distinctions between augmented representations render CL methods sensitive to noise perturbations. Inspired by the Hilbert-Schmidt independence criterion (HSIC), we propose a graph Contrastive Learning model with Kernel Dependence Maximization (CL-KDM) for social recommendation to address these challenges. Specifically, to explicitly learn the kernel dependence of representations and improve the robustness and generalization of recommendation, we maximize the kernel dependence of augmented representations in reproducing kernel Hilbert space by introducing HSIC into graph contrastive learning. Additionally, to extract the hierarchical social dependency across users while preserving underlying structures, we design a hierarchical mutual information maximization module for generating augmented user representations, which are injected into the message passing of a graph neural network to enhance recommendation. Extensive experiments are conducted on three social recommendation datasets, and the results indicate that CL-KDM outperforms various baseline recommendation methods.
KW - contrastive learning
KW - data augmentation
KW - graph neural networks
KW - Hilbert-Schmidt independence criterion
KW - self-supervised learning
UR - http://www.scopus.com/inward/record.url?scp=85194047420&partnerID=8YFLogxK
U2 - 10.1145/3589334.3645412
DO - 10.1145/3589334.3645412
M3 - Conference contribution
AN - SCOPUS:85194047420
T3 - WWW 2024 - Proceedings of the ACM Web Conference
SP - 481
EP - 492
BT - WWW 2024 - Proceedings of the ACM Web Conference
PB - Association for Computing Machinery, Inc
T2 - 33rd ACM Web Conference, WWW 2024
Y2 - 13 May 2024 through 17 May 2024
ER -