MOLGAECL: Molecular Graph Contrastive Learning via Graph Auto-Encoder Pretraining and Fine-Tuning Based on Drug-Drug Interaction Prediction

Yu Li, Lin Xuan Hou, Hai Cheng Yi, Zhu Hong You, Shi Hong Chen, Jia Zheng, Yang Yuan, Cheng Gang Mi

Research output: Contribution to journal · Article · peer-review

Abstract

Drug-drug interactions influence drug efficacy and patient prognosis, making their prediction a problem of substantial research value. Many existing methods struggle with sparse interaction networks or lack the capability to integrate data from multiple sources. In this study, we propose MOLGAECL, a novel approach based on graph autoencoder pretraining and molecular graph contrastive learning. First, a graph autoencoder is pretrained on a large number of unlabeled molecular graphs, with graph contrastive learning applied to obtain more accurate drug representations. Subsequently, full-parameter fine-tuning is performed on different data sets to adapt the model to drug interaction prediction tasks. To assess the effectiveness of MOLGAECL, we conduct comparison experiments against state-of-the-art methods, fine-tuning comparison experiments, and a parameter sensitivity analysis. Extensive experimental results demonstrate the superior performance of MOLGAECL. Specifically, MOLGAECL achieves an average increase of 6.13% in accuracy, 6.14% in AUROC, and 8.16% in AUPRC across all data sets.
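The abstract describes a two-stage pipeline: graph autoencoder pretraining on unlabeled molecular graphs with a contrastive objective, followed by full-parameter fine-tuning for DDI prediction. The sketch below illustrates that flow in plain PyTorch. It is a minimal illustration under assumed choices, not the paper's implementation: the GCN encoder, inner-product adjacency decoder, edge-dropping augmentation, NT-Xent loss, mean pooling, and all dimensions are hypothetical stand-ins.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GCNLayer(nn.Module):
    """One graph convolution: H' = ReLU(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, h, a_hat):  # a_hat: (N, N) adjacency with self-loops
        return F.relu(a_hat @ self.lin(h))

class GraphAutoEncoder(nn.Module):
    """Encodes atoms to embeddings; decodes adjacency by inner product."""
    def __init__(self, feat_dim, hid_dim=128):
        super().__init__()
        self.enc1 = GCNLayer(feat_dim, hid_dim)
        self.enc2 = GCNLayer(hid_dim, hid_dim)

    def encode(self, x, a_hat):
        return self.enc2(self.enc1(x, a_hat), a_hat)

    def forward(self, x, a_hat):
        z = self.encode(x, a_hat)
        return z, torch.sigmoid(z @ z.t())  # embeddings, reconstructed adjacency

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent contrastive loss between two augmented views."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau            # pairwise cosine similarities
    labels = torch.arange(z1.size(0))     # matching rows are positive pairs
    return F.cross_entropy(logits, labels)

def drop_edges(a_hat, p=0.1):
    """Hypothetical augmentation: randomly zero adjacency entries."""
    return a_hat * (torch.rand_like(a_hat) > p).float()

# --- Stage 1: pretraining on a toy unlabeled molecular graph ---
n_atoms, feat_dim = 32, 16
x = torch.randn(n_atoms, feat_dim)                  # toy atom features
adj = (torch.rand(n_atoms, n_atoms) > 0.8).float()  # toy 0/1 bond matrix
a_hat = adj + torch.eye(n_atoms)                    # self-loops; normalization omitted

gae = GraphAutoEncoder(feat_dim)
opt = torch.optim.Adam(gae.parameters(), lr=1e-3)
z1, a_rec = gae(x, drop_edges(a_hat))               # view 1
z2, _ = gae(x, drop_edges(a_hat))                   # view 2
loss = F.binary_cross_entropy(a_rec, adj) + nt_xent(z1, z2)
opt.zero_grad(); loss.backward(); opt.step()

# --- Stage 2: full-parameter fine-tuning for DDI prediction ---
head = nn.Linear(2 * 128, 1)                        # scores a drug pair
ft_opt = torch.optim.Adam(list(gae.parameters()) + list(head.parameters()), lr=1e-4)
drug_a = gae.encode(x, a_hat).mean(dim=0)           # mean-pool to drug-level vector
drug_b = gae.encode(x, a_hat).mean(dim=0)           # second drug (same toy graph here)
logit = head(torch.cat([drug_a, drug_b]))
ddi_loss = F.binary_cross_entropy_with_logits(logit, torch.ones(1))
ft_opt.zero_grad(); ddi_loss.backward(); ft_opt.step()
```

Note that in stage 2 the encoder parameters are included in the fine-tuning optimizer, reflecting the full-parameter fine-tuning the abstract describes rather than a frozen-encoder probe.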

Original language: English
Journal: Journal of Chemical Information and Modeling
State: Accepted/In press - 2025
