Dynamic Graph Representation Learning for Spatio-Temporal Neuroimaging Analysis

Rui Liu, Yao Hu, Jibin Wu, Ka Chun Wong, Zhi An Huang, Yu An Huang, Kay Chen Tan

Research output: Contribution to journal › Article › peer-review

Abstract

Neuroimaging analysis aims to reveal the information-processing mechanisms of the human brain in a noninvasive manner. Graph neural networks (GNNs) have shown promise in capturing the non-Euclidean structure of brain networks. However, existing neuroimaging studies have focused primarily on spatial functional connectivity, overlooking the temporal dynamics of complex brain networks. To address this gap, we propose a spatio-temporal interactive graph representation framework (STIGR) for dynamic neuroimaging analysis that spans classification, regression, and interpretation tasks. STIGR leverages a dynamic adaptive-neighbor graph convolution network to capture the interrelationships between spatial and temporal dynamics. To address the limited global scope of graph convolutions, a Transformer-based self-attention module is introduced to extract long-term dependencies. Contrastive learning is used to adaptively contrast similarities between adjacent scanning windows, modeling cross-temporal correlations in dynamic graphs. Extensive experiments on six public neuroimaging datasets demonstrate the competitive performance of STIGR across different platforms, achieving state-of-the-art results in classification and regression tasks. The proposed framework enables the detection of salient temporal association patterns between regions of interest from sequential neuroimaging signals, offering medical professionals a versatile and interpretable tool for exploring task-specific neurological patterns. Our code and models are available at https://github.com/77YQ77/STIGR/.
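
To make the dynamic-graph construction and cross-window contrastive idea in the abstract more concrete, the following PyTorch sketch builds sliding-window functional-connectivity graphs from a regional BOLD time series and applies an InfoNCE-style loss that treats adjacent scanning windows as positive pairs. All function names, window parameters, and the specific loss form are illustrative assumptions, not the authors' implementation; the actual code is in the linked repository.

import torch
import torch.nn.functional as F

def sliding_window_graphs(bold, window_len=30, stride=5):
    # bold: (num_rois, num_timepoints) regional signal matrix.
    # Each window yields a Pearson-correlation adjacency matrix,
    # giving a sequence of dynamic functional-connectivity graphs.
    graphs = []
    for start in range(0, bold.shape[1] - window_len + 1, stride):
        win = bold[:, start:start + window_len]
        adj = torch.corrcoef(win)              # (num_rois, num_rois)
        graphs.append(torch.nan_to_num(adj))   # guard against flat signals
    return torch.stack(graphs)                 # (num_windows, num_rois, num_rois)

def adjacent_window_contrastive_loss(window_embeddings, temperature=0.5):
    # window_embeddings: (num_windows, embed_dim), one vector per scanning window.
    # Adjacent windows are treated as positives; all other windows act as negatives.
    z = F.normalize(window_embeddings, dim=1)
    sim = z @ z.t() / temperature              # pairwise cosine similarities
    n = z.shape[0]
    sim.fill_diagonal_(float('-inf'))          # exclude self-similarity
    targets = torch.arange(1, n + 1)           # positive of window t is window t+1
    targets[-1] = n - 2                        # last window pairs with its predecessor
    return F.cross_entropy(sim, targets)

A downstream spatio-temporal model would encode each windowed graph (e.g., with a graph convolution followed by self-attention over windows) to obtain window_embeddings before applying this loss; that encoder is omitted here for brevity.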

Original language: English
Journal: IEEE Transactions on Cybernetics
DOI
Publication status: Accepted/In press - 2025
