MOLGAECL: Molecular Graph Contrastive Learning via Graph Auto-Encoder Pretraining and Fine-Tuning Based on Drug-Drug Interaction Prediction

Yu Li, Lin Xuan Hou, Hai Cheng Yi, Zhu Hong You, Shi Hong Chen, Jia Zheng, Yang Yuan, Cheng Gang Mi

Research output: Contribution to journal › Article › peer-review

Abstract

Drug-drug interactions influence drug efficacy and patient prognosis, making their prediction a problem of substantial research value. Some existing methods struggle with the challenges posed by sparse networks or lack the capability to integrate data from multiple sources. In this study, we propose MOLGAECL, a novel approach based on graph autoencoder pretraining and molecular graph contrastive learning. Initially, a large number of unlabeled molecular graphs are pretrained using a graph autoencoder, where graph contrastive learning is applied to obtain more accurate representations of the drugs. Subsequently, full-parameter fine-tuning is performed on different data sets to adapt the model to drug interaction-related prediction tasks. To assess the effectiveness of MOLGAECL, comparison experiments with state-of-the-art methods, fine-tuning comparison experiments, and parameter sensitivity analyses are conducted. Extensive experimental results demonstrate the superior performance of MOLGAECL. Specifically, MOLGAECL achieves an average increase of 6.13% in accuracy, 6.14% in AUROC, and 8.16% in AUPRC across all data sets.
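To illustrate the contrastive-learning component described in the abstract, the sketch below implements the widely used NT-Xent (normalized temperature-scaled cross-entropy) contrastive loss over two augmented views of a batch of molecular-graph embeddings. This is a hedged simplification for illustration only: the function name, shapes, and temperature value are assumptions, and MOLGAECL's full pipeline additionally involves a graph autoencoder and a GNN encoder not shown here.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent contrastive loss (illustrative sketch, not MOLGAECL's exact loss).

    z1, z2: (n, d) arrays of graph-level embeddings. Row i of z1 and row i of z2
    are two augmented views of the same molecule (a positive pair); every other
    row in the batch serves as a negative.
    """
    n = z1.shape[0]
    z = np.concatenate([z1, z2], axis=0)               # (2n, d) stacked views
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # unit-normalize -> cosine sim
    sim = z @ z.T / temperature                        # (2n, 2n) scaled similarities
    np.fill_diagonal(sim, -np.inf)                     # exclude self-similarity
    # Index of the positive partner for each row: row i pairs with row i+n and vice versa.
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    # Cross-entropy of each row's positive against all candidates (log-softmax).
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()
```

Intuitively, the loss is low when the two views of each molecule embed close together while embeddings of different molecules stay apart; pretraining the encoder to minimize it yields drug representations that transfer to downstream interaction prediction.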

Original language: English
Pages (from-to): 3104-3116
Number of pages: 13
Journal: Journal of Chemical Information and Modeling
Volume: 65
Issue number: 6
DOI
Publication status: Published - 24 Mar 2025
