NHGMI: Heterogeneous graph multi-view infomax with node-wise contrasting samples selection

Qing Li, Hang Ni, Yuanchun Wang

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

Heterogeneous Graph Neural Networks (HGNNs) have consistently demonstrated strong performance on Heterogeneous Information Networks (HINs). However, most existing methods are limited to semi-supervised learning scenarios, which require substantial labeling effort and resources. Recently, contrastive learning has catalyzed the development of self-supervised HGNN approaches. In this study, we introduce NHGMI, a novel unsupervised method for heterogeneous graph representation learning based on mutual information maximization. NHGMI fully exploits the inherent characteristics of HINs and fuses diverse semantic information from multiple contrasting views. In addition to inter-view contrasts, NHGMI also incorporates intra-view contrasts. Unlike conventional contrastive learning approaches, NHGMI selects contrasting samples node-wise according to similarity metrics, yielding a noise-free contrastive learning paradigm. Furthermore, we propose an extension of NHGMI that focuses on the generation of negative samples. Extensive experiments on several real-world datasets show that NHGMI consistently outperforms related methods, even surpassing semi-supervised approaches. The source code for NHGMI is publicly available at https://github.com/Frederick-the-Fox/NHGMI.
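As a rough illustration of the node-wise contrasting sample selection sketched in the abstract, the Python/PyTorch snippet below picks, for each node, its most similar nodes in another view as positives and its least similar as negatives, then scores them with an InfoNCE-style objective. The function names, the cosine-similarity criterion, and the k/temperature values are illustrative assumptions, not the authors' implementation; see the linked repository for the actual code.

```python
# Minimal sketch (NOT the authors' implementation) of similarity-based
# selection of contrasting samples for a contrastive objective.
# Embeddings, cosine similarity, and k / tau values are illustrative only.
import torch
import torch.nn.functional as F


def select_contrast_samples(z_anchor, z_other, k_pos=5, k_neg=5):
    """For each anchor node, take the k most similar nodes in the other
    view as positives and the k least similar as negatives (assumed
    cosine-similarity criterion)."""
    sim = F.cosine_similarity(z_anchor.unsqueeze(1), z_other.unsqueeze(0), dim=-1)  # [N, N]
    pos_idx = sim.topk(k_pos, dim=1).indices      # most similar -> positives
    neg_idx = (-sim).topk(k_neg, dim=1).indices   # least similar -> negatives
    return pos_idx, neg_idx


def info_nce_loss(z_anchor, z_other, pos_idx, neg_idx, tau=0.5):
    """Standard InfoNCE-style loss over the selected samples; tau is an
    illustrative temperature."""
    sim = F.cosine_similarity(z_anchor.unsqueeze(1), z_other.unsqueeze(0), dim=-1) / tau
    pos = torch.exp(sim.gather(1, pos_idx)).sum(dim=1)
    neg = torch.exp(sim.gather(1, neg_idx)).sum(dim=1)
    return -torch.log(pos / (pos + neg)).mean()


if __name__ == "__main__":
    # Toy example: 8 nodes, 16-dim embeddings from two contrasting views.
    torch.manual_seed(0)
    z_view1, z_view2 = torch.randn(8, 16), torch.randn(8, 16)
    pos_idx, neg_idx = select_contrast_samples(z_view1, z_view2, k_pos=2, k_neg=2)
    print(info_nce_loss(z_view1, z_view2, pos_idx, neg_idx))
```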

Original language: English
Article number: 111520
Journal: Knowledge-Based Systems
Volume: 289
DOI
Publication status: Published - 8 Apr 2024
