TY - GEN
T1 - Self-supervised Graph Neural Networks via Low-Rank Decomposition
AU - Yang, Liang
AU - Shi, Runjie
AU - Zhang, Qiuliang
AU - Niu, Bingxin
AU - Wang, Zhen
AU - Cao, Xiaochun
AU - Wang, Chuan
N1 - Publisher Copyright:
© 2023 Neural information processing systems foundation. All rights reserved.
PY - 2023
Y1 - 2023
N2 - Self-supervised learning is introduced to train graph neural networks (GNNs) by employing propagation-based GNNs designed for semi-supervised learning tasks. Unfortunately, this common choice tends to cause two serious issues. Firstly, global parameters cause the model to lack the ability to capture local properties. Secondly, it is difficult to handle networks beyond homophily without label information. This paper breaks through the common choice of employing propagation-based GNNs, which aggregate representations of nodes belonging to different classes and tend to lose discriminative information. If the propagation in each ego-network occurs only between nodes of the same class, the obtained representation matrix should exhibit a low-rank characteristic. To meet this requirement, this paper proposes Low-Rank Decomposition-based GNNs (LRD-GNN-Matrix), which apply Low-Rank Decomposition to the attribute matrix. Furthermore, to incorporate long-distance information, a Low-Rank Tensor Decomposition-based GNN (LRD-GNN-Tensor) is proposed, which constructs a node attribute tensor from selected similar ego-networks and performs Low-Rank Tensor Decomposition. The employed tensor nuclear norm facilitates capturing long-distance relationships between the original and selected similar ego-networks. Extensive experiments demonstrate the superior performance and robustness of LRD-GNNs.
AB - Self-supervised learning is introduced to train graph neural networks (GNNs) by employing propagation-based GNNs designed for semi-supervised learning tasks. Unfortunately, this common choice tends to cause two serious issues. Firstly, global parameters cause the model to lack the ability to capture local properties. Secondly, it is difficult to handle networks beyond homophily without label information. This paper breaks through the common choice of employing propagation-based GNNs, which aggregate representations of nodes belonging to different classes and tend to lose discriminative information. If the propagation in each ego-network occurs only between nodes of the same class, the obtained representation matrix should exhibit a low-rank characteristic. To meet this requirement, this paper proposes Low-Rank Decomposition-based GNNs (LRD-GNN-Matrix), which apply Low-Rank Decomposition to the attribute matrix. Furthermore, to incorporate long-distance information, a Low-Rank Tensor Decomposition-based GNN (LRD-GNN-Tensor) is proposed, which constructs a node attribute tensor from selected similar ego-networks and performs Low-Rank Tensor Decomposition. The employed tensor nuclear norm facilitates capturing long-distance relationships between the original and selected similar ego-networks. Extensive experiments demonstrate the superior performance and robustness of LRD-GNNs.
UR - http://www.scopus.com/inward/record.url?scp=85191176168&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85191176168
T3 - Advances in Neural Information Processing Systems
BT - Advances in Neural Information Processing Systems 36 - 37th Conference on Neural Information Processing Systems, NeurIPS 2023
A2 - Oh, A.
A2 - Neumann, T.
A2 - Globerson, A.
A2 - Saenko, K.
A2 - Hardt, M.
A2 - Levine, S.
PB - Neural information processing systems foundation
T2 - 37th Conference on Neural Information Processing Systems, NeurIPS 2023
Y2 - 10 December 2023 through 16 December 2023
ER -